[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
13273 1726853281.18860: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
13273 1726853281.19725: Added group all to inventory
13273 1726853281.19727: Added group ungrouped to inventory
13273 1726853281.19731: Group all now contains ungrouped
13273 1726853281.19734: Examining possible inventory source: /tmp/network-iHm/inventory.yml
13273 1726853281.49033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
13273 1726853281.49095: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
13273 1726853281.49122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
13273 1726853281.49188: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
13273 1726853281.49266: Loaded config def from plugin (inventory/script)
13273 1726853281.49270: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
13273 1726853281.49309: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
13273 1726853281.49398: Loaded config def from plugin
(inventory/yaml) 13273 1726853281.49400: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 13273 1726853281.49488: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 13273 1726853281.49919: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 13273 1726853281.49922: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 13273 1726853281.49925: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 13273 1726853281.49931: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 13273 1726853281.49935: Loading data from /tmp/network-iHm/inventory.yml 13273 1726853281.50009: /tmp/network-iHm/inventory.yml was not parsable by auto 13273 1726853281.50067: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 13273 1726853281.50115: Loading data from /tmp/network-iHm/inventory.yml 13273 1726853281.50200: group all already in inventory 13273 1726853281.50206: set inventory_file for managed_node1 13273 1726853281.50210: set inventory_dir for managed_node1 13273 1726853281.50210: Added host managed_node1 to inventory 13273 1726853281.50213: Added host managed_node1 to group all 13273 1726853281.50213: set ansible_host for managed_node1 13273 1726853281.50214: set ansible_ssh_extra_args for managed_node1 13273 1726853281.50217: set inventory_file for managed_node2 13273 1726853281.50220: set inventory_dir for managed_node2 13273 1726853281.50220: Added host managed_node2 to inventory 13273 1726853281.50222: Added host managed_node2 to group all 13273 1726853281.50222: set ansible_host for managed_node2 13273 1726853281.50223: set ansible_ssh_extra_args for managed_node2 13273 
1726853281.50225: set inventory_file for managed_node3 13273 1726853281.50227: set inventory_dir for managed_node3 13273 1726853281.50228: Added host managed_node3 to inventory 13273 1726853281.50229: Added host managed_node3 to group all 13273 1726853281.50230: set ansible_host for managed_node3 13273 1726853281.50230: set ansible_ssh_extra_args for managed_node3 13273 1726853281.50232: Reconcile groups and hosts in inventory. 13273 1726853281.50236: Group ungrouped now contains managed_node1 13273 1726853281.50237: Group ungrouped now contains managed_node2 13273 1726853281.50239: Group ungrouped now contains managed_node3 13273 1726853281.50317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 13273 1726853281.50444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 13273 1726853281.50493: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 13273 1726853281.50528: Loaded config def from plugin (vars/host_group_vars) 13273 1726853281.50531: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 13273 1726853281.50537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 13273 1726853281.50545: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 13273 1726853281.50588: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 13273 1726853281.50932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853281.51035: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 13273 1726853281.51084: Loaded config def from plugin (connection/local) 13273 1726853281.51088: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 13273 1726853281.51836: Loaded config def from plugin (connection/paramiko_ssh) 13273 1726853281.51839: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 13273 1726853281.53282: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13273 1726853281.53438: Loaded config def from plugin (connection/psrp) 13273 1726853281.53441: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 13273 1726853281.54776: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13273 1726853281.54816: Loaded config def from plugin (connection/ssh) 13273 1726853281.54819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 13273 1726853281.56919: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13273 1726853281.56957: Loaded config def from plugin (connection/winrm) 13273 1726853281.56960: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 13273 1726853281.56994: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 13273 1726853281.57066: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 13273 1726853281.57141: Loaded config def from plugin (shell/cmd) 13273 1726853281.57143: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 13273 1726853281.57169: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 13273 1726853281.57241: Loaded config def from plugin (shell/powershell) 13273 1726853281.57243: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 13273 1726853281.57296: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 13273 1726853281.57486: Loaded config def from plugin (shell/sh) 13273 1726853281.57488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 13273 1726853281.57520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 13273 1726853281.57646: Loaded config def from plugin (become/runas) 13273 1726853281.57648: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 13273 1726853281.57842: Loaded config def from plugin (become/su) 13273 1726853281.57844: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 13273 1726853281.58014: Loaded config def from plugin (become/sudo) 13273 
1726853281.58017: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 13273 1726853281.58049: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml 13273 1726853281.58394: in VariableManager get_vars() 13273 1726853281.58414: done with get_vars() 13273 1726853281.58555: trying /usr/local/lib/python3.12/site-packages/ansible/modules 13273 1726853281.61757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 13273 1726853281.61887: in VariableManager get_vars() 13273 1726853281.62008: done with get_vars() 13273 1726853281.62011: variable 'playbook_dir' from source: magic vars 13273 1726853281.62012: variable 'ansible_playbook_python' from source: magic vars 13273 1726853281.62013: variable 'ansible_config_file' from source: magic vars 13273 1726853281.62014: variable 'groups' from source: magic vars 13273 1726853281.62015: variable 'omit' from source: magic vars 13273 1726853281.62015: variable 'ansible_version' from source: magic vars 13273 1726853281.62016: variable 'ansible_check_mode' from source: magic vars 13273 1726853281.62017: variable 'ansible_diff_mode' from source: magic vars 13273 1726853281.62018: variable 'ansible_forks' from source: magic vars 13273 1726853281.62018: variable 'ansible_inventory_sources' from source: magic vars 13273 1726853281.62019: variable 'ansible_skip_tags' from source: magic vars 13273 1726853281.62020: variable 'ansible_limit' from source: magic vars 13273 1726853281.62020: variable 'ansible_run_tags' from source: magic vars 13273 1726853281.62021: variable 'ansible_verbosity' from source: magic vars 13273 1726853281.62057: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml 13273 1726853281.64626: in 
VariableManager get_vars() 13273 1726853281.64645: done with get_vars() 13273 1726853281.64656: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 13273 1726853281.66352: in VariableManager get_vars() 13273 1726853281.66367: done with get_vars() 13273 1726853281.66377: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13273 1726853281.66709: in VariableManager get_vars() 13273 1726853281.66727: done with get_vars() 13273 1726853281.67077: in VariableManager get_vars() 13273 1726853281.67093: done with get_vars() 13273 1726853281.67102: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13273 1726853281.67181: in VariableManager get_vars() 13273 1726853281.67198: done with get_vars() 13273 1726853281.67898: in VariableManager get_vars() 13273 1726853281.67920: done with get_vars() 13273 1726853281.67925: variable 'omit' from source: magic vars 13273 1726853281.67945: variable 'omit' from source: magic vars 13273 1726853281.67981: in VariableManager get_vars() 13273 1726853281.67993: done with get_vars() 13273 1726853281.68156: in VariableManager get_vars() 13273 1726853281.68169: done with get_vars() 13273 1726853281.68267: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13273 
1726853281.68719: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13273 1726853281.68959: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13273 1726853281.74916: in VariableManager get_vars() 13273 1726853281.74939: done with get_vars() 13273 1726853281.76441: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 13273 1726853281.76779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853281.80083: in VariableManager get_vars() 13273 1726853281.80107: done with get_vars() 13273 1726853281.80117: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13273 1726853281.80394: in VariableManager get_vars() 13273 1726853281.80414: done with get_vars() 13273 1726853281.80645: in VariableManager get_vars() 13273 1726853281.80662: done with get_vars() 13273 1726853281.81253: in VariableManager get_vars() 13273 1726853281.81270: done with get_vars() 13273 1726853281.81278: variable 'omit' from source: magic vars 13273 1726853281.81289: variable 'omit' from source: magic vars 13273 1726853281.81627: variable 'controller_profile' from source: play vars 13273 1726853281.81790: in VariableManager get_vars() 13273 1726853281.81804: done with get_vars() 13273 1726853281.81825: in VariableManager get_vars() 13273 1726853281.81839: done with get_vars() 13273 1726853281.81988: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13273 1726853281.82236: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13273 1726853281.82423: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13273 1726853281.83220: in VariableManager get_vars() 13273 1726853281.83241: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853281.88873: in VariableManager get_vars() 13273 1726853281.88898: done with get_vars() 13273 1726853281.88903: variable 'omit' from source: magic vars 13273 1726853281.88915: variable 'omit' from source: magic vars 13273 1726853281.88945: in VariableManager get_vars() 13273 1726853281.88963: done with get_vars() 13273 1726853281.88990: in VariableManager get_vars() 13273 1726853281.89009: done with get_vars() 13273 1726853281.89039: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13273 1726853281.89356: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13273 1726853281.89450: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13273 1726853281.90244: in VariableManager get_vars() 13273 1726853281.90266: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853281.94998: in VariableManager get_vars() 13273 1726853281.95028: done with get_vars() 13273 1726853281.95034: variable 'omit' from source: magic vars 13273 1726853281.95046: variable 'omit' from source: magic vars 13273 1726853281.95078: in VariableManager get_vars() 13273 1726853281.95321: done with get_vars() 13273 1726853281.95344: in VariableManager get_vars() 13273 1726853281.95368: done with get_vars() 13273 1726853281.95557: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13273 
1726853281.95795: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13273 1726853281.96206: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13273 1726853281.97435: in VariableManager get_vars() 13273 1726853281.97465: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853282.03253: in VariableManager get_vars() 13273 1726853282.03393: done with get_vars() 13273 1726853282.03400: variable 'omit' from source: magic vars 13273 1726853282.03425: variable 'omit' from source: magic vars 13273 1726853282.03466: in VariableManager get_vars() 13273 1726853282.03604: done with get_vars() 13273 1726853282.03624: in VariableManager get_vars() 13273 1726853282.03654: done with get_vars() 13273 1726853282.03821: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13273 1726853282.03999: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13273 1726853282.04086: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13273 1726853282.05123: in VariableManager get_vars() 13273 1726853282.05153: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853282.07615: in VariableManager get_vars() 13273 1726853282.07653: done with get_vars() 13273 1726853282.07662: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 13273 1726853282.08270: in VariableManager get_vars() 13273 1726853282.08307: done with get_vars() 13273 
1726853282.08369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 13273 1726853282.08387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 13273 1726853282.08648: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 13273 1726853282.08821: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 13273 1726853282.08823: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 13273 1726853282.09007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 13273 1726853282.09033: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 13273 1726853282.09258: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 13273 1726853282.09332: Loaded config def from plugin (callback/default) 13273 1726853282.09336: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 13273 1726853282.10694: Loaded config def from plugin (callback/junit) 13273 1726853282.10698: 
Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 13273 1726853282.10778: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 13273 1726853282.10885: Loaded config def from plugin (callback/minimal) 13273 1726853282.10888: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 13273 1726853282.10924: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 13273 1726853282.11014: Loaded config def from plugin (callback/tree) 13273 1726853282.11017: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 13273 1726853282.11154: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 13273 1726853282.11157: Loading CallbackModule 
'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_removal_nm.yml ********************************************
2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
13273 1726853282.11221: in VariableManager get_vars()
13273 1726853282.11238: done with get_vars()
13273 1726853282.11247: in VariableManager get_vars()
13273 1726853282.11258: done with get_vars()
13273 1726853282.11267: variable 'omit' from source: magic vars
13273 1726853282.11326: in VariableManager get_vars()
13273 1726853282.11348: done with get_vars()
13273 1726853282.11379: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with nm as provider] *****
13273 1726853282.12022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
13273 1726853282.12107: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
13273 1726853282.12139: getting the remaining hosts for this loop
13273 1726853282.12141: done getting the remaining hosts for this loop
13273 1726853282.12156: getting the next task for host managed_node3
13273 1726853282.12161: done getting next task for host managed_node3
13273 1726853282.12163: ^ task is: TASK: Gathering Facts
13273 1726853282.12164: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853282.12167: getting variables
13273 1726853282.12168: in VariableManager get_vars()
13273 1726853282.12179: Calling all_inventory to load vars for managed_node3
13273 1726853282.12182: Calling groups_inventory to load vars for managed_node3
13273 1726853282.12184: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853282.12196: Calling all_plugins_play to load vars for managed_node3
13273 1726853282.12207: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853282.12210: Calling groups_plugins_play to load vars for managed_node3
13273 1726853282.12248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853282.12312: done with get_vars()
13273 1726853282.12318: done getting variables
13273 1726853282.12390: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
Friday 20 September 2024  13:28:02 -0400 (0:00:00.013)       0:00:00.013 ******
13273 1726853282.12412: entering _queue_task() for managed_node3/gather_facts
13273 1726853282.12413: Creating lock for gather_facts
13273 1726853282.12956: worker is 1 (out of 1 available)
13273 1726853282.12965: exiting _queue_task() for managed_node3/gather_facts
13273 1726853282.12981: done queuing things up, now waiting for results queue to drain
13273 1726853282.12982: waiting for pending
results... 13273 1726853282.13301: running TaskExecutor() for managed_node3/TASK: Gathering Facts 13273 1726853282.13307: in run() - task 02083763-bbaf-5fc3-657d-0000000001bc 13273 1726853282.13309: variable 'ansible_search_path' from source: unknown 13273 1726853282.13354: calling self._execute() 13273 1726853282.13579: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853282.13582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853282.13585: variable 'omit' from source: magic vars 13273 1726853282.13587: variable 'omit' from source: magic vars 13273 1726853282.13590: variable 'omit' from source: magic vars 13273 1726853282.13712: variable 'omit' from source: magic vars 13273 1726853282.13764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853282.13919: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853282.13946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853282.13965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853282.14031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853282.14230: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853282.14234: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853282.14236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853282.14298: Set connection var ansible_connection to ssh 13273 1726853282.14477: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853282.14584: Set connection var ansible_shell_executable to /bin/sh 13273 1726853282.14587: Set connection var 
ansible_shell_type to sh 13273 1726853282.14590: Set connection var ansible_pipelining to False 13273 1726853282.14592: Set connection var ansible_timeout to 10 13273 1726853282.14594: variable 'ansible_shell_executable' from source: unknown 13273 1726853282.14596: variable 'ansible_connection' from source: unknown 13273 1726853282.14597: variable 'ansible_module_compression' from source: unknown 13273 1726853282.14599: variable 'ansible_shell_type' from source: unknown 13273 1726853282.14601: variable 'ansible_shell_executable' from source: unknown 13273 1726853282.14603: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853282.14605: variable 'ansible_pipelining' from source: unknown 13273 1726853282.14606: variable 'ansible_timeout' from source: unknown 13273 1726853282.14608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853282.15015: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853282.15033: variable 'omit' from source: magic vars 13273 1726853282.15049: starting attempt loop 13273 1726853282.15166: running the handler 13273 1726853282.15170: variable 'ansible_facts' from source: unknown 13273 1726853282.15174: _low_level_execute_command(): starting 13273 1726853282.15179: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853282.16851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853282.16905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853282.16948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853282.16977: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853282.16997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853282.17164: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853282.17309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853282.17404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853282.19123: stdout chunk (state=3): >>>/root <<< 13273 1726853282.19269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853282.19277: stdout chunk (state=3): >>><<< 13273 1726853282.19280: stderr chunk (state=3): >>><<< 13273 1726853282.19299: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853282.19377: _low_level_execute_command(): starting 13273 1726853282.19383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568 `" && echo ansible-tmp-1726853282.1935554-13333-24327516684568="` echo /root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568 `" ) && sleep 0' 13273 1726853282.20476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853282.20480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853282.20790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853282.20836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853282.20974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853282.23156: stdout chunk (state=3): >>>ansible-tmp-1726853282.1935554-13333-24327516684568=/root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568 <<< 13273 1726853282.23249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853282.23297: stderr chunk (state=3): >>><<< 13273 1726853282.23301: stdout chunk (state=3): >>><<< 13273 1726853282.23316: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853282.1935554-13333-24327516684568=/root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853282.23355: variable 'ansible_module_compression' from source: unknown 13273 1726853282.23430: ANSIBALLZ: Using generic lock for ansible.legacy.setup 13273 1726853282.23677: ANSIBALLZ: Acquiring lock 13273 1726853282.23680: ANSIBALLZ: Lock acquired: 140136094830320 13273 1726853282.23682: ANSIBALLZ: Creating module 13273 1726853282.87887: ANSIBALLZ: Writing module into payload 13273 1726853282.88425: ANSIBALLZ: Writing module 13273 1726853282.88501: ANSIBALLZ: Renaming module 13273 1726853282.88978: ANSIBALLZ: Done creating module 13273 1726853282.88981: variable 'ansible_facts' from source: unknown 13273 1726853282.88983: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853282.88985: _low_level_execute_command(): starting 13273 1726853282.88987: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 13273 1726853282.90536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853282.90595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853282.90618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853282.90636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853282.90852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853282.93095: stdout chunk (state=3): >>>PLATFORM <<< 13273 1726853282.93199: stdout chunk (state=3): >>>Linux <<< 13273 1726853282.93228: stdout chunk (state=3): >>>FOUND <<< 13273 1726853282.93276: stdout chunk (state=3): >>>/usr/bin/python3.12 <<< 13273 1726853282.93288: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 13273 1726853282.93450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853282.93492: stderr chunk (state=3): >>><<< 13273 1726853282.93587: stdout chunk (state=3): >>><<< 13273 1726853282.93680: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853282.93685 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 13273 1726853282.93689: _low_level_execute_command(): starting 13273 1726853282.93691: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 13273 1726853282.94165: Sending initial data 13273 1726853282.94184: Sent initial data (1181 bytes) 13273 1726853282.95151: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853282.95387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853282.95408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853282.95662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853283.00858: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 13273 1726853283.01518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853283.01522: stdout chunk (state=3): >>><<< 13273 1726853283.01525: stderr chunk (state=3): >>><<< 13273 1726853283.01534: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853283.01710: variable 'ansible_facts' from source: unknown 13273 1726853283.01737: variable 'ansible_facts' from source: unknown 13273 1726853283.01839: variable 'ansible_module_compression' from source: unknown 13273 1726853283.01862: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13273 1726853283.01952: variable 'ansible_facts' from source: unknown 13273 
1726853283.02390: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/AnsiballZ_setup.py 13273 1726853283.02804: Sending initial data 13273 1726853283.02808: Sent initial data (153 bytes) 13273 1726853283.04010: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853283.04165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853283.04235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853283.06343: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853283.06538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853283.06621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp5w22g6gp /root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/AnsiballZ_setup.py <<< 13273 1726853283.06627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/AnsiballZ_setup.py" <<< 13273 1726853283.06974: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp5w22g6gp" to remote "/root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/AnsiballZ_setup.py" <<< 13273 1726853283.10415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853283.10505: stderr chunk (state=3): >>><<< 13273 1726853283.10509: stdout chunk (state=3): >>><<< 13273 1726853283.10613: done transferring module to remote 13273 1726853283.10617: _low_level_execute_command(): starting 13273 1726853283.10620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/ /root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/AnsiballZ_setup.py && sleep 0' 13273 1726853283.11917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853283.11933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853283.12101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853283.12109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853283.12112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853283.12239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853283.14723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853283.14815: stderr chunk (state=3): >>><<< 13273 1726853283.14818: stdout chunk (state=3): >>><<< 13273 1726853283.14820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853283.14823: _low_level_execute_command(): starting 13273 1726853283.14825: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/AnsiballZ_setup.py && sleep 0' 13273 1726853283.15974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853283.16088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853283.16191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853283.16201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853283.16208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853283.16600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853283.19821: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 13273 1726853283.19951: stdout chunk (state=3): >>>import '_io' # <<< 13273 1726853283.19955: stdout chunk (state=3): >>>import 'marshal' # <<< 13273 1726853283.19957: stdout chunk (state=3): >>>import 'posix' # <<< 13273 1726853283.19992: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<< 13273 1726853283.20011: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 13273 1726853283.20094: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 13273 1726853283.20115: stdout chunk (state=3): >>>import 'codecs' # <<< 13273 1726853283.20155: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 13273 1726853283.20188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778bbc4d0> <<< 13273 1726853283.20227: stdout chunk 
(state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778b8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 13273 1726853283.20279: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778bbea50> import '_signal' # <<< 13273 1726853283.20285: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 13273 1726853283.20355: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 13273 1726853283.20478: stdout chunk (state=3): >>>import '_collections_abc' # <<< 13273 1726853283.20605: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 13273 1726853283.20655: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 13273 1726853283.20674: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778bcd130> <<< 13273 1726853283.20727: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 13273 1726853283.20749: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4778bcdfa0> <<< 13273 1726853283.20822: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 13273 1726853283.21476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 13273 1726853283.21495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13273 1726853283.21556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 13273 1726853283.21614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 13273 1726853283.21775: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789ebda0> <<< 13273 1726853283.21778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 13273 1726853283.21780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789ebfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 13273 
1726853283.21838: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853283.21863: stdout chunk (state=3): >>>import 'itertools' # <<< 13273 1726853283.21883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 13273 1726853283.21904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a237a0> <<< 13273 1726853283.21926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 13273 1726853283.21966: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a23e30> import '_collections' # <<< 13273 1726853283.22027: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a03a70> import '_functools' # <<< 13273 1726853283.22089: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a01190> <<< 13273 1726853283.22193: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789e8f50> <<< 13273 1726853283.22223: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 13273 1726853283.22252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 13273 1726853283.22331: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 13273 1726853283.22339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 13273 1726853283.22346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 13273 1726853283.22440: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a43710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a42330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a02060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789ea810> <<< 13273 1726853283.22537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a787a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789e81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 13273 1726853283.22649: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' 
import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a78c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a78b00> <<< 13273 1726853283.22662: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a78ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789e6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853283.22708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 13273 1726853283.22715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 13273 1726853283.22948: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a795b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a79280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a7a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a906e0> <<< 13273 1726853283.23076: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a91df0> <<< 13273 1726853283.23079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 13273 1726853283.23081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 13273 1726853283.23083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a92c60> <<< 13273 1726853283.23174: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a932c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a921b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a93d40> <<< 13273 1726853283.23216: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a93470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a7a510> <<< 13273 1726853283.23240: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 13273 1726853283.23267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 13273 1726853283.23293: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 13273 1726853283.23397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778787b90> <<< 13273 1726853283.23419: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787b0620> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f47787b0380> <<< 13273 1726853283.23480: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787b0650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 13273 1726853283.23498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 13273 1726853283.23570: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.23745: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787b0f80> <<< 13273 1726853283.23988: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787b18b0> <<< 13273 1726853283.24010: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b0830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778785d30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 13273 1726853283.24039: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 13273 1726853283.24047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 13273 1726853283.24158: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b2c00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b0e00> <<< 13273 1726853283.24177: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a7ac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 13273 1726853283.24225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853283.24705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787def60> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778803320> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477882ff80> <<< 13273 1726853283.24707: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 13273 1726853283.24743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 13273 1726853283.24767: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13273 1726853283.24829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13273 1726853283.24950: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47788627e0> <<< 13273 1726853283.25056: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47788601a0> <<< 13273 1726853283.25137: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778803fb0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781291c0> <<< 13273 1726853283.25392: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778802120> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b3b60> <<< 13273 1726853283.25422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 13273 1726853283.25440: stdout chunk (state=3): 
>>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4778802240> <<< 13273 1726853283.26011: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_o6fk94gt/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.26053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 13273 1726853283.26056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13273 1726853283.26111: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 13273 1726853283.26222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13273 1726853283.26257: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477818ae40> import '_typing' # <<< 13273 1726853283.26552: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778169d30> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778168ef0> <<< 13273 1726853283.26572: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.26606: stdout chunk (state=3): >>>import 'ansible' # <<< 13273 1726853283.26628: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.26651: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 13273 1726853283.26687: stdout chunk (state=3): >>># zipimport: zlib available <<< 
13273 1726853283.29029: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.30814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778188ce0> <<< 13273 1726853283.30862: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 13273 1726853283.30892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 13273 1726853283.30918: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47781c2870> <<< 13273 1726853283.31013: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781c2600> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781c1f10> <<< 13273 1726853283.31040: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 13273 
1726853283.31079: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781c2660> <<< 13273 1726853283.31115: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477818b860> import 'atexit' # <<< 13273 1726853283.31190: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47781c35c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47781c3800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13273 1726853283.31263: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 13273 1726853283.31288: stdout chunk (state=3): >>>import '_locale' # <<< 13273 1726853283.31483: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781c3d40> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 13273 1726853283.31498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778025a00> <<< 13273 1726853283.31898: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.31902: stdout chunk (state=3): 
>>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778027680> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778027f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477802cf80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 13273 1726853283.31999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477802fce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787deed0> <<< 13273 1726853283.32033: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477802dfa0> <<< 13273 1726853283.32046: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13273 1726853283.32092: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 13273 1726853283.32145: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 13273 1726853283.32345: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13273 1726853283.32376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 13273 1726853283.32463: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 13273 1726853283.32486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778033c20> import '_tokenize' # <<< 13273 1726853283.32591: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778032720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778032480> <<< 13273 1726853283.32633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 13273 1726853283.32652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13273 1726853283.32775: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780329c0> <<< 13273 1726853283.32829: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477802e4b0> <<< 13273 1726853283.32875: stdout chunk (state=3): >>># extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.32972: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778077e00><<< 13273 1726853283.33011: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853283.33037: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778077800> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 13273 1726853283.33053: stdout chunk (state=3): >>> <<< 13273 1726853283.33063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 13273 1726853283.33217: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778079a30> <<< 13273 1726853283.33220: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780797f0> <<< 13273 1726853283.33247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches 
/usr/lib64/python3.12/uuid.py<<< 13273 1726853283.33259: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 13273 1726853283.33461: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.33464: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f477807bfb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477807a120> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 13273 1726853283.33487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853283.33503: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 13273 1726853283.33530: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 13273 1726853283.33623: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477807f5f0> <<< 13273 1726853283.33827: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477807bf20> <<< 13273 1726853283.33998: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778080740> # extension module 
'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853283.34063: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47780808c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853283.34085: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778080950> <<< 13273 1726853283.34309: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778078170> <<< 13273 1726853283.34331: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778083fe0><<< 13273 1726853283.34490: stdout chunk (state=3): >>> <<< 13273 1726853283.34606: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.34643: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777f0d250> <<< 13273 1726853283.34655: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780827e0> <<< 13273 1726853283.34750: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778083b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778082420> <<< 13273 1726853283.34785: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.34813: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 13273 1726853283.34853: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.34995: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.35147: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.35174: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.35203: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 13273 1726853283.35226: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 13273 1726853283.35259: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853283.35497: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.35593: stdout chunk 
(state=3): >>> <<< 13273 1726853283.35641: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.35724: stdout chunk (state=3): >>> <<< 13273 1726853283.36584: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.37586: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 13273 1726853283.37628: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 13273 1726853283.37643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853283.37731: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777f11460><<< 13273 1726853283.37949: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f12240> <<< 13273 1726853283.37953: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780804a0><<< 13273 1726853283.37955: stdout chunk (state=3): >>> <<< 13273 1726853283.38025: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 13273 1726853283.38048: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.38078: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 13273 1726853283.38112: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 13273 1726853283.38149: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.38378: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.38617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py<<< 13273 1726853283.38680: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc'<<< 13273 1726853283.38693: stdout chunk (state=3): >>> import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f122d0> <<< 13273 1726853283.38898: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.39453: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.39688: stdout chunk (state=3): >>> <<< 13273 1726853283.40219: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.40363: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.40470: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 13273 1726853283.40495: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.40554: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.40564: stdout chunk (state=3): >>> <<< 13273 1726853283.40797: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.40867: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 13273 1726853283.40911: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.40940: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 13273 1726853283.40968: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.40984: stdout chunk (state=3): >>> <<< 
13273 1726853283.41079: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.41121: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 13273 1726853283.41133: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.41508: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.41895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13273 1726853283.41974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc'<<< 13273 1726853283.42006: stdout chunk (state=3): >>> import '_ast' # <<< 13273 1726853283.42118: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f13380><<< 13273 1726853283.42140: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853283.42288: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.42373: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 13273 1726853283.42441: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 13273 1726853283.42473: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 13273 1726853283.42506: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.42585: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # <<< 13273 1726853283.42684: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.42688: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.42759: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.42822: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.42926: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13273 1726853283.43026: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853283.43282: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.43286: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777f1dd00> <<< 13273 1726853283.43300: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f18d70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 13273 1726853283.43497: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.43529: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.43710: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13273 1726853283.43713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13273 1726853283.43810: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 13273 1726853283.43851: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 13273 1726853283.44008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778006690> <<< 13273 1726853283.44039: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780fe360><<< 13273 1726853283.44185: stdout chunk (state=3): >>> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f1dcd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f132f0><<< 13273 1726853283.44203: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro<<< 13273 1726853283.44234: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # <<< 13273 1726853283.44252: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.44359: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 13273 1726853283.44362: stdout chunk (state=3): >>> <<< 13273 1726853283.44475: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 13273 1726853283.44493: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.44544: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 13273 1726853283.44661: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853283.44687: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.44723: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.44754: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.44894: stdout 
chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.45067: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.45070: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # <<< 13273 1726853283.45075: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.45077: stdout chunk (state=3): >>> <<< 13273 1726853283.45195: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.45327: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.45339: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.45455: stdout chunk (state=3): >>> import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 13273 1726853283.45803: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.46140: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.46261: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py<<< 13273 1726853283.46370: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb1af0> <<< 13273 1726853283.46380: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 13273 1726853283.46430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc'<<< 13273 1726853283.46433: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 13273 1726853283.46558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc'<<< 13273 1726853283.46574: stdout chunk (state=3): >>> <<< 13273 1726853283.46678: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bafad0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.46681: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777bafe30> <<< 13273 1726853283.46761: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb3050><<< 13273 1726853283.46796: stdout chunk (state=3): >>> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb2630> <<< 13273 1726853283.46840: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb01d0><<< 13273 1726853283.46893: stdout chunk (state=3): >>> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb0bc0><<< 13273 
1726853283.46898: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 13273 1726853283.46980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc'<<< 13273 1726853283.46998: stdout chunk (state=3): >>> <<< 13273 1726853283.47026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 13273 1726853283.47064: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'<<< 13273 1726853283.47119: stdout chunk (state=3): >>> # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.47136: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777bc6ed0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bc6780><<< 13273 1726853283.47209: stdout chunk (state=3): >>> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777bc6960> <<< 13273 1726853283.47261: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bc5bb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/util.py <<< 13273 1726853283.47474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 13273 1726853283.47481: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bc70b0><<< 13273 1726853283.47521: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 13273 1726853283.47574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 13273 1726853283.47620: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853283.47668: stdout chunk (state=3): >>> import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777c1dbe0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bc7bc0> <<< 13273 1726853283.47736: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb03e0> <<< 13273 1726853283.47739: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 13273 1726853283.47812: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 13273 1726853283.47817: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.47848: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other' # <<< 13273 1726853283.47863: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853283.48026: stdout chunk (state=3): 
>>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 13273 1726853283.48061: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.48134: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.48218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 13273 1726853283.48270: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.48285: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 13273 1726853283.48297: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853283.48377: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853283.48406: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 13273 1726853283.48692: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.48715: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 13273 1726853283.48744: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.48800: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.48858: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.48963: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 13273 1726853283.49696: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.49914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 13273 1726853283.49941: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.50087: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 13273 1726853283.50142: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.50145: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 13273 1726853283.50161: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.50342: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.50360: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 13273 1726853283.50557: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 13273 1726853283.50570: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.50597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 13273 1726853283.50795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777c1e5a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 13273 1726853283.50909: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777c1e780> import 'ansible.module_utils.facts.system.local' # <<< 13273 1726853283.50984: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.51070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 13273 
1726853283.51130: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.51221: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.51358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 13273 1726853283.51464: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.51569: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 13273 1726853283.51583: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.51629: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.51701: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 13273 1726853283.51770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 13273 1726853283.51872: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.51980: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777c5de80> <<< 13273 1726853283.52368: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477816b6e0> import 'ansible.module_utils.facts.system.python' # <<< 13273 1726853283.52455: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.52480: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.52706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.52902: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.52986: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13273 1726853283.53296: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 13273 1726853283.53344: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.53502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777c65a30> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777c42ea0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.53518: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 13273 1726853283.53530: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.53608: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.53620: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 13273 1726853283.53794: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.54006: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 13273 1726853283.54182: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.54195: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.54279: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.54306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.54395: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.54477: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.54641: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 13273 1726853283.54647: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.54747: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.54877: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 13273 1726853283.54940: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.54946: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.55705: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.56450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 13273 1726853283.56520: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.56702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 13273 1726853283.56999: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.57014: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 13273 1726853283.57221: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.57484: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 
13273 1726853283.57512: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 13273 1726853283.57554: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.57579: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.57640: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 13273 1726853283.57913: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.58033: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.58290: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.58503: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 13273 1726853283.58519: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.58564: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.58596: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 13273 1726853283.58621: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.58654: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 13273 1726853283.58885: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.58888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 13273 1726853283.58932: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.59185: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: 
zlib available <<< 13273 1726853283.59405: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.59684: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 13273 1726853283.59727: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.59786: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 13273 1726853283.59798: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.59991: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.60017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 13273 1726853283.60034: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.60108: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.60187: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 13273 1726853283.60205: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.60286: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 13273 1726853283.60491: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853283.60538: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.60612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 13273 1726853283.60632: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 13273 1726853283.60789: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 13273 1726853283.60960: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.61207: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 13273 1726853283.61210: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.61387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 13273 1726853283.61446: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.61541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 13273 1726853283.61642: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.61726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 13273 1726853283.61847: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853283.63021: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 13273 1726853283.63065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 13273 1726853283.63094: stdout chunk (state=3): >>># extension 
module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853283.63111: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777a62300> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777a61970> <<< 13273 1726853283.63212: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777a59be0> <<< 13273 1726853283.77006: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 13273 1726853283.77104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777aaacf0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 13273 1726853283.77130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 13273 1726853283.77156: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777aa8dd0> <<< 13273 1726853283.77290: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code 
object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777aab110> <<< 13273 1726853283.77329: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777aa9df0> <<< 13273 1726853283.77610: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame<<< 13273 1726853283.77640: stdout chunk (state=3): >>> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 13273 1726853284.04408: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": 
"10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2981, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 550, "free": 2981}, "nocache": {"free": 3296, "used": 235}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 427, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805453312, "block_size": 4096, "block_total": 65519099, "block_available": 63917347, "block_used": 1601752, "inode_total": 131070960, "inode_available": 131029150, "inode_used": 41810, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on 
[fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_loadavg": {"1m": 0.6416015625, "5m": 0.36767578125, "15m": 0.17236328125}, "ansible_service_mgr": "systemd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", 
"SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "04", "epoch": "1726853284", "epoch_int": "1726853284", "date": "2024-09-20", "time": "13:28:04", "iso8601_micro": "2024-09-20T17:28:04.037309Z", "iso8601": "2024-09-20T17:28:04Z", "iso8601_basic": "20240920T132804037309", "iso8601_basic_short": "20240920T132804", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13273 1726853284.04706: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv <<< 13273 1726853284.04974: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] 
removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # 
cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing 
_string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] 
removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos <<< 13273 1726853284.04994: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 13273 1726853284.05486: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 13273 1726853284.05556: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib <<< 13273 1726853284.05560: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 13273 1726853284.05562: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 13273 1726853284.05568: stdout chunk (state=3): >>># destroy _locale<<< 13273 1726853284.05582: stdout chunk (state=3): >>> <<< 13273 1726853284.05584: stdout chunk (state=3): >>># destroy locale # destroy select <<< 13273 1726853284.05704: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # 
destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 13273 1726853284.05807: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 13273 1726853284.05840: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 13273 1726853284.05847: stdout chunk (state=3): >>># destroy json <<< 13273 1726853284.05880: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 13273 1726853284.05934: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 13273 1726853284.05938: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 13273 1726853284.06578: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping 
_string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 13273 1726853284.06600: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 13273 1726853284.06748: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 13273 1726853284.06753: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 13273 1726853284.06818: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 13273 1726853284.06841: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 13273 1726853284.06847: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 13273 1726853284.06947: stdout chunk (state=3): >>># clear sys.audit hooks <<< 13273 1726853284.07548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.11.217 closed. <<< 13273 1726853284.07551: stdout chunk (state=3): >>><<< 13273 1726853284.07577: stderr chunk (state=3): >>><<< 13273 1726853284.07892: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778bbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778b8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778bbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778bcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778bcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789ebda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789ebfe0> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a237a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a23e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a03a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a01190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789e8f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a43710> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4778a42330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a02060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789ea810> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a787a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789e81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a78c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a78b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a78ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47789e6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a795b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a79280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a7a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a906e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a91df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4778a92c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a932c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a921b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778a93d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a93470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a7a510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778787b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787b0620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b0380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787b0650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787b0f80> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787b18b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b0830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778785d30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b2c00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b0e00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778a7ac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787def60> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778803320> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477882ff80> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47788627e0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47788601a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778803fb0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781291c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778802120> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47787b3b60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4778802240> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_o6fk94gt/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477818ae40> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778169d30> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778168ef0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778188ce0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47781c2870> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781c2600> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781c1f10> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781c2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477818b860> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47781c35c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47781c3800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47781c3d40> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778025a00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778027680> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778027f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477802cf80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477802fce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47787deed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477802dfa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778033c20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778032720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778032480> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780329c0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477802e4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778077e00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778077800> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778079a30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780797f0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f477807bfb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477807a120> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477807f5f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477807bf20> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778080740> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47780808c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778080950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778078170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778083fe0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777f0d250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780827e0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4778083b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778082420> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777f11460> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f12240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780804a0> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f122d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f13380> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777f1dd00> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f18d70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4778006690> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47780fe360> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f1dcd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777f132f0> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb1af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bafad0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777bafe30> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb3050> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb2630> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb01d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb0bc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777bc6ed0> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4777bc6780> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777bc6960> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bc5bb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bc70b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777c1dbe0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777bc7bc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777fb03e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777c1e5a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4777c1e780> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777c5de80> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f477816b6e0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777c65a30> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777c42ea0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4777a62300> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777a61970> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777a59be0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 
'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777aaacf0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777aa8dd0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777aab110> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4777aa9df0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": 
"6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2981, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 550, "free": 2981}, "nocache": {"free": 3296, "used": 235}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, 
"labels": {}, "masters": {}}, "ansible_uptime_seconds": 427, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805453312, "block_size": 4096, "block_total": 65519099, "block_available": 63917347, "block_used": 1601752, "inode_total": 131070960, "inode_available": 131029150, "inode_used": 41810, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": 
"eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_loadavg": {"1m": 0.6416015625, "5m": 0.36767578125, "15m": 0.17236328125}, "ansible_service_mgr": "systemd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "04", "epoch": "1726853284", "epoch_int": "1726853284", "date": "2024-09-20", "time": "13:28:04", "iso8601_micro": "2024-09-20T17:28:04.037309Z", "iso8601": "2024-09-20T17:28:04Z", "iso8601_basic": "20240920T132804037309", "iso8601_basic_short": "20240920T132804", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum 
# cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # 
cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] 
removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other 
# destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy 
hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy 
systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata 
# destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # 
cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy 
_bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 13273 1726853284.09786: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853284.09790: _low_level_execute_command(): starting 13273 1726853284.09793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853282.1935554-13333-24327516684568/ > /dev/null 2>&1 && sleep 0' 13273 1726853284.11164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853284.11167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853284.11170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853284.11175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853284.11523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853284.11526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853284.11643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853284.11738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853284.13602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853284.13639: stderr chunk (state=3): >>><<< 13273 1726853284.13879: stdout chunk (state=3): >>><<< 13273 1726853284.13882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853284.13885: handler run complete 13273 1726853284.13993: variable 'ansible_facts' from source: unknown 13273 1726853284.14286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853284.17149: variable 'ansible_facts' from source: unknown 13273 1726853284.17441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853284.17577: attempt loop complete, returning result 13273 1726853284.17646: _execute() done 13273 1726853284.17655: dumping result to json 13273 1726853284.17779: done dumping result, returning 13273 1726853284.17794: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-5fc3-657d-0000000001bc] 13273 1726853284.17812: sending task result for task 02083763-bbaf-5fc3-657d-0000000001bc 13273 1726853284.18578: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001bc 13273 1726853284.18581: WORKER PROCESS EXITING ok: [managed_node3] 13273 1726853284.19070: no more pending results, returning what we have 13273 1726853284.19076: results queue empty 13273 1726853284.19077: checking for any_errors_fatal 13273 1726853284.19078: done checking for any_errors_fatal 13273 1726853284.19079: checking for max_fail_percentage 13273 1726853284.19080: done 
checking for max_fail_percentage 13273 1726853284.19081: checking to see if all hosts have failed and the running result is not ok 13273 1726853284.19082: done checking to see if all hosts have failed 13273 1726853284.19083: getting the remaining hosts for this loop 13273 1726853284.19084: done getting the remaining hosts for this loop 13273 1726853284.19088: getting the next task for host managed_node3 13273 1726853284.19095: done getting next task for host managed_node3 13273 1726853284.19096: ^ task is: TASK: meta (flush_handlers) 13273 1726853284.19098: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853284.19102: getting variables 13273 1726853284.19104: in VariableManager get_vars() 13273 1726853284.19132: Calling all_inventory to load vars for managed_node3 13273 1726853284.19135: Calling groups_inventory to load vars for managed_node3 13273 1726853284.19139: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853284.19150: Calling all_plugins_play to load vars for managed_node3 13273 1726853284.19153: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853284.19156: Calling groups_plugins_play to load vars for managed_node3 13273 1726853284.19960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853284.20401: done with get_vars() 13273 1726853284.20414: done getting variables 13273 1726853284.20484: in VariableManager get_vars() 13273 1726853284.20495: Calling all_inventory to load vars for managed_node3 13273 1726853284.20497: Calling groups_inventory to load vars for managed_node3 13273 1726853284.20500: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853284.20505: 
Calling all_plugins_play to load vars for managed_node3 13273 1726853284.20507: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853284.20510: Calling groups_plugins_play to load vars for managed_node3 13273 1726853284.20846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853284.21599: done with get_vars() 13273 1726853284.21614: done queuing things up, now waiting for results queue to drain 13273 1726853284.21616: results queue empty 13273 1726853284.21617: checking for any_errors_fatal 13273 1726853284.21619: done checking for any_errors_fatal 13273 1726853284.21620: checking for max_fail_percentage 13273 1726853284.21625: done checking for max_fail_percentage 13273 1726853284.21626: checking to see if all hosts have failed and the running result is not ok 13273 1726853284.21627: done checking to see if all hosts have failed 13273 1726853284.21628: getting the remaining hosts for this loop 13273 1726853284.21629: done getting the remaining hosts for this loop 13273 1726853284.21631: getting the next task for host managed_node3 13273 1726853284.21636: done getting next task for host managed_node3 13273 1726853284.21638: ^ task is: TASK: Include the task 'el_repo_setup.yml' 13273 1726853284.21640: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853284.21642: getting variables 13273 1726853284.21643: in VariableManager get_vars() 13273 1726853284.21651: Calling all_inventory to load vars for managed_node3 13273 1726853284.21653: Calling groups_inventory to load vars for managed_node3 13273 1726853284.21656: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853284.21661: Calling all_plugins_play to load vars for managed_node3 13273 1726853284.21663: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853284.21666: Calling groups_plugins_play to load vars for managed_node3 13273 1726853284.21999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853284.22426: done with get_vars() 13273 1726853284.22435: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:11 Friday 20 September 2024 13:28:04 -0400 (0:00:02.101) 0:00:02.114 ****** 13273 1726853284.22516: entering _queue_task() for managed_node3/include_tasks 13273 1726853284.22518: Creating lock for include_tasks 13273 1726853284.23256: worker is 1 (out of 1 available) 13273 1726853284.23269: exiting _queue_task() for managed_node3/include_tasks 13273 1726853284.23283: done queuing things up, now waiting for results queue to drain 13273 1726853284.23285: waiting for pending results... 
13273 1726853284.23910: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 13273 1726853284.23950: in run() - task 02083763-bbaf-5fc3-657d-000000000006 13273 1726853284.23973: variable 'ansible_search_path' from source: unknown 13273 1726853284.24047: calling self._execute() 13273 1726853284.24336: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853284.24340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853284.24342: variable 'omit' from source: magic vars 13273 1726853284.24460: _execute() done 13273 1726853284.24584: dumping result to json 13273 1726853284.24593: done dumping result, returning 13273 1726853284.24606: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-5fc3-657d-000000000006] 13273 1726853284.24617: sending task result for task 02083763-bbaf-5fc3-657d-000000000006 13273 1726853284.25077: done sending task result for task 02083763-bbaf-5fc3-657d-000000000006 13273 1726853284.25080: WORKER PROCESS EXITING 13273 1726853284.25126: no more pending results, returning what we have 13273 1726853284.25131: in VariableManager get_vars() 13273 1726853284.25165: Calling all_inventory to load vars for managed_node3 13273 1726853284.25168: Calling groups_inventory to load vars for managed_node3 13273 1726853284.25173: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853284.25187: Calling all_plugins_play to load vars for managed_node3 13273 1726853284.25190: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853284.25193: Calling groups_plugins_play to load vars for managed_node3 13273 1726853284.25602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853284.25979: done with get_vars() 13273 1726853284.25987: variable 'ansible_search_path' from source: unknown 13273 1726853284.26003: we have 
included files to process
13273 1726853284.26004: generating all_blocks data
13273 1726853284.26005: done generating all_blocks data
13273 1726853284.26005: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
13273 1726853284.26007: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
13273 1726853284.26009: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
13273 1726853284.27233: in VariableManager get_vars()
13273 1726853284.27248: done with get_vars()
13273 1726853284.27259: done processing included file
13273 1726853284.27262: iterating over new_blocks loaded from include file
13273 1726853284.27263: in VariableManager get_vars()
13273 1726853284.27476: done with get_vars()
13273 1726853284.27479: filtering new block on tags
13273 1726853284.27494: done filtering new block on tags
13273 1726853284.27497: in VariableManager get_vars()
13273 1726853284.27508: done with get_vars()
13273 1726853284.27510: filtering new block on tags
13273 1726853284.27527: done filtering new block on tags
13273 1726853284.27530: in VariableManager get_vars()
13273 1726853284.27541: done with get_vars()
13273 1726853284.27542: filtering new block on tags
13273 1726853284.27555: done filtering new block on tags
13273 1726853284.27557: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3
13273 1726853284.27563: extending task lists for all hosts with included blocks
13273 1726853284.27612: done extending task lists
13273 1726853284.27614: done processing included files
13273 1726853284.27614: results queue empty
13273 1726853284.27615: checking for any_errors_fatal
13273 1726853284.27617: done checking for any_errors_fatal
13273 1726853284.27617: checking for max_fail_percentage
13273 1726853284.27618: done checking for max_fail_percentage
13273 1726853284.27619: checking to see if all hosts have failed and the running result is not ok
13273 1726853284.27619: done checking to see if all hosts have failed
13273 1726853284.27620: getting the remaining hosts for this loop
13273 1726853284.27621: done getting the remaining hosts for this loop
13273 1726853284.27623: getting the next task for host managed_node3
13273 1726853284.27627: done getting next task for host managed_node3
13273 1726853284.27629: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test
13273 1726853284.27631: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853284.27633: getting variables
13273 1726853284.27634: in VariableManager get_vars()
13273 1726853284.27642: Calling all_inventory to load vars for managed_node3
13273 1726853284.27644: Calling groups_inventory to load vars for managed_node3
13273 1726853284.27646: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853284.27651: Calling all_plugins_play to load vars for managed_node3
13273 1726853284.27653: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853284.27655: Calling groups_plugins_play to load vars for managed_node3
13273 1726853284.28214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853284.28386: done with get_vars()
13273 1726853284.28395: done getting variables

TASK [Gather the minimum subset of ansible_facts required by the network role test] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Friday 20 September 2024 13:28:04 -0400 (0:00:00.059)       0:00:02.173 ******
13273 1726853284.28460: entering _queue_task() for managed_node3/setup
13273 1726853284.29395: worker is 1 (out of 1 available)
13273 1726853284.29409: exiting _queue_task() for managed_node3/setup
13273 1726853284.29423: done queuing things up, now waiting for results queue to drain
13273 1726853284.29424: waiting for pending results...
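[Editor's note] The `_queue_task()` / "worker is 1 (out of 1 available)" / "waiting for pending results..." entries above reflect a queue-and-worker dispatch pattern: the strategy hands a (host, task) pair to a worker pool and then drains a results queue. The sketch below is purely illustrative of that pattern, not Ansible's actual strategy-plugin code; all names (`run_one_task`, `worker`) are made up for this example.

```python
# Illustrative sketch (NOT Ansible's real implementation) of the
# queue-task / drain-results pattern visible in the log above.
import queue
import threading


def worker(task_queue, results_queue):
    """Pull (host, task) items until a None sentinel arrives."""
    while True:
        item = task_queue.get()
        if item is None:  # sentinel: shut the worker down
            break
        host, task = item
        # A real worker would run a TaskExecutor; here we just report "ok".
        results_queue.put((host, task, "ok"))
        task_queue.task_done()


def run_one_task(host, task, n_workers=1):
    task_queue, results_queue = queue.Queue(), queue.Queue()
    threads = [
        threading.Thread(target=worker, args=(task_queue, results_queue))
        for _ in range(n_workers)
    ]
    for t in threads:
        t.start()
    task_queue.put((host, task))  # "entering _queue_task()"
    task_queue.join()             # "waiting for pending results..."
    for _ in range(n_workers):
        task_queue.put(None)      # release the workers
    for t in threads:
        t.join()
    return results_queue.get()


print(run_one_task("managed_node3", "Gather the minimum subset of ansible_facts"))
```

The single worker here mirrors "worker is 1 (out of 1 available)"; raising `n_workers` corresponds roughly to running with a larger `forks` setting.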
13273 1726853284.29684: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 13273 1726853284.30079: in run() - task 02083763-bbaf-5fc3-657d-0000000001cd 13273 1726853284.30083: variable 'ansible_search_path' from source: unknown 13273 1726853284.30086: variable 'ansible_search_path' from source: unknown 13273 1726853284.30088: calling self._execute() 13273 1726853284.30266: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853284.30282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853284.30297: variable 'omit' from source: magic vars 13273 1726853284.31202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853284.35950: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853284.35955: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853284.36078: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853284.36107: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853284.36138: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853284.36229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853284.36410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853284.36439: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853284.36496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853284.36596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853284.36935: variable 'ansible_facts' from source: unknown 13273 1726853284.37008: variable 'network_test_required_facts' from source: task vars 13273 1726853284.37178: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 13273 1726853284.37190: variable 'omit' from source: magic vars 13273 1726853284.37231: variable 'omit' from source: magic vars 13273 1726853284.37461: variable 'omit' from source: magic vars 13273 1726853284.37465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853284.37467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853284.37469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853284.37778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853284.37781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853284.37784: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853284.37786: variable 'ansible_host' from source: host vars for 
'managed_node3' 13273 1726853284.37788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853284.37843: Set connection var ansible_connection to ssh 13273 1726853284.37858: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853284.37867: Set connection var ansible_shell_executable to /bin/sh 13273 1726853284.37876: Set connection var ansible_shell_type to sh 13273 1726853284.37891: Set connection var ansible_pipelining to False 13273 1726853284.37901: Set connection var ansible_timeout to 10 13273 1726853284.37934: variable 'ansible_shell_executable' from source: unknown 13273 1726853284.37999: variable 'ansible_connection' from source: unknown 13273 1726853284.38007: variable 'ansible_module_compression' from source: unknown 13273 1726853284.38014: variable 'ansible_shell_type' from source: unknown 13273 1726853284.38021: variable 'ansible_shell_executable' from source: unknown 13273 1726853284.38027: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853284.38034: variable 'ansible_pipelining' from source: unknown 13273 1726853284.38040: variable 'ansible_timeout' from source: unknown 13273 1726853284.38110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853284.38376: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853284.38380: variable 'omit' from source: magic vars 13273 1726853284.38383: starting attempt loop 13273 1726853284.38385: running the handler 13273 1726853284.38431: _low_level_execute_command(): starting 13273 1726853284.38643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853284.39880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 
1726853284.39902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853284.39974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853284.40005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853284.40100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853284.42019: stdout chunk (state=3): >>>/root <<< 13273 1726853284.42023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853284.42027: stdout chunk (state=3): >>><<< 13273 1726853284.42029: stderr chunk (state=3): >>><<< 13273 1726853284.42126: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853284.42137: _low_level_execute_command(): starting 13273 1726853284.42140: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913 `" && echo ansible-tmp-1726853284.420525-13446-267272453182913="` echo /root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913 `" ) && sleep 0' 13273 1726853284.43482: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853284.43653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853284.43827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853284.43892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853284.45890: stdout chunk (state=3): >>>ansible-tmp-1726853284.420525-13446-267272453182913=/root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913 <<< 13273 1726853284.46048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853284.46297: stdout chunk (state=3): >>><<< 13273 1726853284.46300: stderr chunk (state=3): >>><<< 13273 1726853284.46303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853284.420525-13446-267272453182913=/root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853284.46305: variable 'ansible_module_compression' from source: unknown 13273 1726853284.46514: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13273 1726853284.46588: variable 'ansible_facts' from source: unknown 13273 1726853284.47083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/AnsiballZ_setup.py 13273 1726853284.47422: Sending initial data 13273 1726853284.47515: Sent initial data (153 bytes) 13273 1726853284.48650: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853284.48990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 13273 1726853284.49007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853284.49103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853284.50790: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853284.50860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853284.50921: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpnj7usasf /root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/AnsiballZ_setup.py <<< 13273 1726853284.50941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/AnsiballZ_setup.py" <<< 13273 1726853284.50992: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpnj7usasf" to remote "/root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/AnsiballZ_setup.py" <<< 13273 1726853284.55064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853284.55069: stdout chunk (state=3): >>><<< 13273 1726853284.55078: stderr chunk (state=3): >>><<< 13273 1726853284.55080: done transferring module to remote 13273 1726853284.55261: _low_level_execute_command(): starting 13273 1726853284.55265: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/ /root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/AnsiballZ_setup.py && sleep 0' 13273 1726853284.56454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853284.56554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853284.56599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 
1726853284.56912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853284.57362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853284.57863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853284.59796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853284.59800: stdout chunk (state=3): >>><<< 13273 1726853284.59803: stderr chunk (state=3): >>><<< 13273 1726853284.59822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853284.59892: _low_level_execute_command(): starting 13273 1726853284.59903: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/AnsiballZ_setup.py && sleep 0' 13273 1726853284.61126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853284.61378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853284.61418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853284.61436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853284.61449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853284.61554: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853284.64169: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # <<< 13273 1726853284.64197: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<< 13273 1726853284.64285: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 13273 1726853284.64289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853284.64378: stdout chunk (state=3): >>>import '_codecs' # <<< 13273 1726853284.64384: stdout chunk (state=3): >>>import 'codecs' # <<< 13273 1726853284.64389: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 13273 1726853284.64556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eedb44d0> <<< 13273 1726853284.64560: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eed83b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eedb6a50> <<< 13273 1726853284.64583: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 13273 1726853284.64673: stdout chunk 
(state=3): >>>import '_stat' # import 'stat' # <<< 13273 1726853284.64712: stdout chunk (state=3): >>>import '_collections_abc' # <<< 13273 1726853284.64751: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 13273 1726853284.64790: stdout chunk (state=3): >>>import 'os' # <<< 13273 1726853284.64819: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 13273 1726853284.64848: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 13273 1726853284.64879: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 13273 1726853284.65099: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeb65130> <<< 13273 1726853284.65119: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeb65fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13273 1726853284.65760: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853284.65852: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 13273 1726853284.65885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 13273 1726853284.65908: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba3e60> <<< 13273 1726853284.65925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 13273 1726853284.65969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba3ef0> <<< 13273 1726853284.65999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 13273 1726853284.66028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 13273 1726853284.66116: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 13273 1726853284.66132: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853284.66153: stdout chunk (state=3): >>>import 'itertools' # <<< 13273 1726853284.66214: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebdb860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 13273 1726853284.66293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebdbef0> import '_collections' # <<< 13273 1726853284.66353: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebbbb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebb9220> <<< 13273 1726853284.66563: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba1010> <<< 13273 1726853284.66566: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 13273 1726853284.66626: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 13273 1726853284.66659: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 
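[Editor's note] The flood of `import <name> #` and `# code object from ...` lines in the stdout chunks above is the CPython import trace: the module was launched with `PYTHONVERBOSE=1 /usr/bin/python3.12 .../AnsiballZ_setup.py`, and `PYTHONVERBOSE` (equivalent to `python -v`) makes the interpreter log every module as it is imported. A minimal standalone reproduction of that mechanism, independent of Ansible:

```python
# Demonstrate where the "import <name> # ..." lines in the log come from:
# PYTHONVERBOSE=1 (same as `python -v`) traces every import.
import os
import subprocess
import sys

env = dict(os.environ, PYTHONVERBOSE="1")
proc = subprocess.run(
    [sys.executable, "-c", "import base64"],
    env=env,
    capture_output=True,
    text=True,
)
# In a plain invocation the trace goes to stderr.
trace = [line for line in proc.stderr.splitlines() if line.startswith("import ")]
print(trace[:3])  # e.g. starts with "import _frozen_importlib # frozen"
```

The first traced entries (`import _frozen_importlib # frozen`, `import _imp # builtin`, ...) match the start of the stdout chunks captured in the log.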
13273 1726853284.66700: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebfb7a0> <<< 13273 1726853284.66761: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebfa3c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebba0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba28d0> <<< 13273 1726853284.66837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 13273 1726853284.66849: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec307d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba0290> <<< 13273 1726853284.66989: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec30c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec30b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.67023: stdout chunk (state=3): >>># extension module 'binascii' 
executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec30f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeb9edb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 13273 1726853284.67084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec315e0> <<< 13273 1726853284.67132: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec312b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 13273 1726853284.67184: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec324b0> import 'importlib.util' # import 'runpy' # <<< 13273 1726853284.67326: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec486b0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.67361: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.67376: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec49d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 13273 1726853284.67397: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 13273 1726853284.67487: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec4ac00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec4b260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec4a150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 13273 1726853284.67548: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec4bce0> <<< 13273 1726853284.67618: stdout chunk (state=3): >>>import 'lzma' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36eec4b410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec32420> <<< 13273 1726853284.67676: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 13273 1726853284.67703: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 13273 1726853284.67761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 13273 1726853284.67862: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee93bc50> <<< 13273 1726853284.67865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 13273 1726853284.67892: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee9646b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee964410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' 
import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee9646e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 13273 1726853284.67992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.68165: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee965010> <<< 13273 1726853284.68405: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee965a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee9648c0> <<< 13273 1726853284.68439: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee939df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 13273 1726853284.68576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee966e10> <<< 13273 1726853284.68601: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f36ee965b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec32bd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 13273 1726853284.68660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 13273 1726853284.68705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 13273 1726853284.68731: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee98f1a0> <<< 13273 1726853284.68829: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853284.68861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 13273 1726853284.68936: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee9b3500> <<< 13273 1726853284.68953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 13273 1726853284.69128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36eea14230> <<< 13273 1726853284.69162: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 13273 1726853284.69257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 13273 1726853284.69262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13273 1726853284.69284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13273 1726853284.69386: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eea16990> <<< 13273 1726853284.69536: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eea14350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee9d9250> <<< 13273 1726853284.69694: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee325370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee9b2300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee967d70> <<< 13273 1726853284.69926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 13273 1726853284.69940: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f36ee325610> <<< 13273 1726853284.70236: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_setup_payload_nodzs2wz/ansible_setup_payload.zip' # zipimport: zlib available <<< 13273 1726853284.70586: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13273 1726853284.70603: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 13273 1726853284.70663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13273 1726853284.70685: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 13273 1726853284.70698: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee38f0b0> <<< 13273 1726853284.70713: stdout chunk (state=3): >>>import '_typing' # <<< 13273 1726853284.70987: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee36dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee36d130> <<< 13273 1726853284.71195: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 13273 1726853284.71288: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.73428: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.75175: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py<<< 13273 1726853284.75218: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee38ce00><<< 13273 1726853284.75221: stdout chunk (state=3): >>> <<< 13273 1726853284.75356: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 13273 1726853284.75359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 13273 1726853284.75392: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py<<< 13273 1726853284.75504: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853284.75507: stdout chunk (state=3): >>> import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee3bea50><<< 13273 1726853284.75566: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3be7e0> <<< 13273 1726853284.75617: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3be0f0><<< 13273 1726853284.75668: stdout chunk (state=3): >>> <<< 13273 1726853284.75764: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' 
<<< 13273 1726853284.75767: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3be540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee38fb30><<< 13273 1726853284.76074: stdout chunk (state=3): >>> import 'atexit' # <<< 13273 1726853284.76078: stdout chunk (state=3): >>> # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853284.76081: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853284.76106: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee3bf7a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee3bf9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 13273 1726853284.76151: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3bfef0> import 'pwd' # <<< 13273 1726853284.76185: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py<<< 13273 1726853284.76207: stdout chunk (state=3): >>> <<< 13273 1726853284.76237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 13273 1726853284.76296: stdout chunk (state=3): >>> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee229d00><<< 13273 1726853284.76329: stdout chunk 
(state=3): >>> <<< 13273 1726853284.76437: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee22b920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 13273 1726853284.76499: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22c2f0> <<< 13273 1726853284.76534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 13273 1726853284.76555: stdout chunk (state=3): >>> <<< 13273 1726853284.76656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22d490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 13273 1726853284.76728: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 13273 1726853284.76763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 13273 1726853284.76789: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 13273 1726853284.76874: stdout chunk (state=3): >>> <<< 13273 1726853284.76886: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22ff20><<< 13273 1726853284.76945: stdout chunk (state=3): >>> # extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.76975: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eeb9eea0><<< 13273 1726853284.77012: stdout chunk (state=3): >>> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22e1e0> <<< 13273 1726853284.77091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13273 1726853284.77128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 13273 1726853284.77204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13273 1726853284.77395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 13273 1726853284.77481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee237e60><<< 13273 1726853284.77495: stdout chunk (state=3): >>> import '_tokenize' # <<< 13273 1726853284.77574: stdout chunk (state=3): >>> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee236930> <<< 13273 1726853284.77620: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee2366c0> <<< 13273 
1726853284.77737: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13273 1726853284.77780: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee236c00><<< 13273 1726853284.77795: stdout chunk (state=3): >>> <<< 13273 1726853284.77836: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22e6f0><<< 13273 1726853284.77961: stdout chunk (state=3): >>> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee27bf80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee27c1d0><<< 13273 1726853284.77997: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 13273 1726853284.78015: stdout chunk (state=3): >>> <<< 13273 1726853284.78034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 13273 1726853284.78092: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 13273 1726853284.78095: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 13273 1726853284.78149: stdout chunk (state=3): >>> # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.78184: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee27dd90><<< 13273 1726853284.78210: stdout chunk (state=3): >>> <<< 13273 1726853284.78229: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee27db50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 13273 1726853284.78279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 13273 1726853284.78362: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.78412: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.78513: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee280350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee27e480> <<< 13273 1726853284.78541: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853284.78567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 13273 1726853284.78693: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee283a40> <<< 13273 1726853284.78864: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee280410> <<< 13273 1726853284.78963: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853284.79062: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee284830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853284.79082: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee284c50> <<< 13273 1726853284.79130: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853284.79181: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee284b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee27c4d0> <<< 13273 1726853284.79214: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # 
code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 13273 1726853284.79251: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 13273 1726853284.79299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 13273 1726853284.79603: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.79606: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee10c470><<< 13273 1726853284.79608: stdout chunk (state=3): >>> <<< 13273 1726853284.79660: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.79690: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.79719: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee10d520> <<< 13273 1726853284.79738: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee286c30> <<< 13273 1726853284.79775: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853284.79946: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.79949: stdout chunk (state=3): >>>import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee287f80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee286840> <<< 13273 1726853284.79967: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 13273 1726853284.80106: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853284.80122: stdout chunk (state=3): >>> <<< 13273 1726853284.80272: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 13273 1726853284.80287: stdout chunk (state=3): >>> import 'ansible.module_utils.common' # <<< 13273 1726853284.80302: stdout chunk (state=3): >>> # zipimport: zlib available<<< 13273 1726853284.80329: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853284.80374: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 13273 1726853284.80760: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 13273 1726853284.80773: stdout chunk (state=3): >>> <<< 13273 1726853284.81725: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.82670: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 13273 1726853284.82709: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 13273 1726853284.82790: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 13273 1726853284.82823: stdout chunk (state=3): >>> <<< 13273 1726853284.83076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853284.83086: stdout chunk (state=3): >>># extension module '_ctypes' loaded 
from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.83089: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee1157c0><<< 13273 1726853284.83091: stdout chunk (state=3): >>> <<< 13273 1726853284.83093: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 13273 1726853284.83220: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee116510> <<< 13273 1726853284.83224: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee10d730> import 'ansible.module_utils.compat.selinux' # <<< 13273 1726853284.83236: stdout chunk (state=3): >>> <<< 13273 1726853284.83263: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.83298: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853284.83305: stdout chunk (state=3): >>> <<< 13273 1726853284.83340: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 13273 1726853284.83346: stdout chunk (state=3): >>> <<< 13273 1726853284.83387: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.83648: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.83830: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 13273 1726853284.83973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee116600> # zipimport: zlib available <<< 13273 
1726853284.84648: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.85372: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.85513: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.85582: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 13273 1726853284.85748: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 13273 1726853284.85798: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.85964: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 13273 1726853284.85968: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 13273 1726853284.86057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853284.86094: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 13273 1726853284.86106: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.86486: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.86865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13273 1726853284.87093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee117830> # zipimport: zlib available <<< 13273 1726853284.87250: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.87312: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 
13273 1726853284.87335: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.87487: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 13273 1726853284.87525: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.87592: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.87712: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.87766: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13273 1726853284.87830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853284.87960: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee122180> <<< 13273 1726853284.88149: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee11d130> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 13273 1726853284.88168: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.88249: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.88514: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13273 1726853284.88518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 13273 1726853284.88697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee20aab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3ea780> <<< 13273 1726853284.88825: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee122240> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee114950> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 13273 1726853284.88848: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853284.88881: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.88913: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 13273 1726853284.88997: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 13273 1726853284.89024: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.89164: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853284.89237: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13273 1726853284.89272: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.89289: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.89347: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.89406: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.89456: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.89594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 13273 1726853284.89691: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.89829: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 13273 1726853284.90145: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.90386: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.90435: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.90504: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853284.90591: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 13273 1726853284.90698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b24b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 13273 1726853284.90755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 13273 1726853284.90803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 13273 1726853284.90904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edd68230> <<< 13273 1726853284.91005: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36edd68560> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee19f4a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b2ff0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b0bf0> <<< 13273 1726853284.91033: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b07d0> <<< 13273 1726853284.91046: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 13273 1726853284.91125: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 13273 1726853284.91158: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 13273 1726853284.91197: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 13273 1726853284.91248: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36edd6b590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edd6ae40> <<< 13273 1726853284.91286: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36edd6aff0> <<< 13273 1726853284.91365: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edd6a270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 13273 1726853284.91521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 13273 1726853284.91702: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edd6b710> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eddca210> <<< 13273 1726853284.91744: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eddc8230> <<< 13273 1726853284.91786: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b0980> import 'ansible.module_utils.facts.timeout' # <<< 13273 1726853284.92006: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 13273 1726853284.92013: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853284.92092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 13273 1726853284.92123: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.92194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.92267: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 13273 1726853284.92314: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853284.92352: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 13273 1726853284.92599: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 13273 1726853284.92655: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.92716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 13273 1726853284.92734: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.92821: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.92913: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.92993: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.93084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 13273 1726853284.93105: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 13273 1726853284.93286: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.93936: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.94655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 13273 1726853284.94679: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.94769: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.94848: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.94899: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.94992: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 13273 1726853284.95049: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.95104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 13273 1726853284.95125: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 
1726853284.95217: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.95296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 13273 1726853284.95328: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.95377: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.95440: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 13273 1726853284.95459: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.95693: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853284.95811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 13273 1726853284.95857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 13273 1726853284.95887: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eddcbda0> <<< 13273 1726853284.95932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 13273 1726853284.95974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 13273 1726853284.96150: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eddcaf30> <<< 13273 1726853284.96202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 13273 1726853284.96386: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.96405: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 13273 1726853284.96425: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13273 1726853284.96565: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.96705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 13273 1726853284.96737: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.96851: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.96967: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 13273 1726853284.97079: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.97201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 13273 1726853284.97301: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853284.97493: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ede06360> <<< 13273 1726853284.97735: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee2837a0> <<< 13273 1726853284.97756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 13273 1726853284.97758: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.97853: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.97938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 13273 1726853284.97968: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.98105: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.98237: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13273 1726853284.98525: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.98654: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 13273 1726853284.98693: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 13273 1726853284.98719: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.98797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 13273 1726853284.98866: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.99101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ede1a150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ede19d30> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853284.99186: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 13273 1726853284.99198: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.99465: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.99688: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 13273 1726853284.99712: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853284.99867: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.00029: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.00095: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.00167: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 13273 1726853285.00193: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 13273 1726853285.00210: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.00236: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.00268: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.00485: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.00713: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 13273 1726853285.00717: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 13273 1726853285.00743: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.00927: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.01197: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853285.01237: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.02199: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.02977: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 13273 1726853285.03027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 13273 1726853285.03030: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.03205: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.03362: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 13273 1726853285.03590: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853285.03698: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 13273 1726853285.03710: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.03972: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.04230: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 13273 1726853285.04263: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 13273 1726853285.04301: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.04360: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.04445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 13273 1726853285.04463: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.04595: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.04756: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.05096: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.05438: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 13273 1726853285.05443: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 13273 1726853285.05473: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.05514: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.05585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 13273 1726853285.05625: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.05628: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.05673: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # 
<<< 13273 1726853285.05703: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.05813: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.05918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 13273 1726853285.06011: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 13273 1726853285.06014: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.06105: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.06202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 13273 1726853285.06216: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.06299: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.06423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 13273 1726853285.06491: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.07109: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.07294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 13273 1726853285.07386: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.07484: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 13273 1726853285.07497: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.07594: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.07630: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 13273 1726853285.07633: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.07689: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.07725: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.netbsd' # <<< 13273 1726853285.07750: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.07814: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.08088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 13273 1726853285.08123: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853285.08155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 13273 1726853285.08195: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 13273 1726853285.08387: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13273 1726853285.08449: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.08516: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.08627: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.08744: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 13273 1726853285.08774: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 13273 1726853285.08799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 13273 1726853285.08856: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.08928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 13273 1726853285.08948: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.09296: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.09622: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.linux' # <<< 13273 1726853285.09689: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.09742: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.09832: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 13273 1726853285.09901: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.09997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 13273 1726853285.10216: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.10255: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 13273 1726853285.10259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 13273 1726853285.10261: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.10591: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 13273 1726853285.10644: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.10984: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 13273 1726853285.11018: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.11040: stdout chunk (state=3): >>># extension module 'unicodedata' executed from 
'/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36edc17290> <<< 13273 1726853285.11097: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edc17aa0> <<< 13273 1726853285.11205: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edc16f30> <<< 13273 1726853285.12887: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "05", "epoch": "1726853285", "epoch_int": "1726853285", "date": "2024-09-20", "time": "13:28:05", "iso8601_micro": "2024-09-20T17:28:05.107775Z", "iso8601": "2024-09-20T17:28:05Z", "iso8601_basic": 
"20240920T132805107775", "iso8601_basic_short": "20240920T132805", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13273 1726853285.13747: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 13273 1726853285.13766: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear 
sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix <<< 13273 1726853285.14011: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time <<< 13273 1726853285.14086: stdout chunk (state=3): >>># cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # 
cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] 
removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy 
ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing 
ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process <<< 13273 1726853285.14158: stdout chunk (state=3): >>># cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] 
removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd <<< 13273 1726853285.14162: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr <<< 13273 1726853285.14227: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd<<< 13273 1726853285.14253: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd<<< 13273 1726853285.14256: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 13273 1726853285.15265: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # 
destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata <<< 13273 1726853285.15329: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime <<< 13273 1726853285.15613: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # 
cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib <<< 13273 1726853285.15708: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 <<< 13273 1726853285.15711: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 13273 1726853285.15901: stdout chunk (state=3): >>># cleanup[3] wiping 
builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13273 1726853285.15943: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 13273 1726853285.15959: stdout chunk (state=3): >>># destroy _collections <<< 13273 1726853285.16010: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 13273 1726853285.16087: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 13273 1726853285.16115: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse <<< 13273 1726853285.16135: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 13273 1726853285.16258: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 13273 1726853285.16284: stdout chunk (state=3): >>># destroy atexit <<< 13273 1726853285.16397: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # 
destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 13273 1726853285.16896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853285.16932: stderr chunk (state=3): >>><<< 13273 1726853285.16941: stdout chunk (state=3): >>><<< 13273 1726853285.17298: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eedb44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eed83b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eedb6a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' 
Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeb65130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeb65fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba3e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import 
'_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba3ef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebdb860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebdbef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebbbb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebb9220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba1010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 
're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebfb7a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebfa3c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eebba0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba28d0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec307d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeba0290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec30c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec30b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec30f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eeb9edb0> # 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec315e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec312b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec324b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec486b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec49d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches 
/usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec4ac00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec4b260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec4a150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eec4bce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec4b410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec32420> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee93bc50> # 
/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee9646b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee964410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee9646e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee965010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee965a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee9648c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee939df0> # 
/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee966e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee965b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eec32bd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee98f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee9b3500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eea14230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eea16990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eea14350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee9d9250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee325370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee9b2300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee967d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f36ee325610> # zipimport: found 103 names in '/tmp/ansible_setup_payload_nodzs2wz/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee38f0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee36dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee36d130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee38ce00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee3bea50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f36ee3be7e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3be0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3be540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee38fb30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee3bf7a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee3bf9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3bfef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee229d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee22b920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22c2f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22d490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22ff20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eeb9eea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22e1e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches 
/usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee237e60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee236930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee2366c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee236c00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee22e6f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee27bf80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee27c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from 
'/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee27dd90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee27db50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee280350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee27e480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee283a40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee280410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee284830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee284c50> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee284b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee27c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee10c470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee10d520> import 'socket' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36ee286c30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee287f80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee286840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee1157c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f36ee116510> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee10d730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee116600> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee117830> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ee122180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee11d130> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee20aab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee3ea780> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36ee122240> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee114950> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b24b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edd68230> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36edd68560> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee19f4a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b2ff0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b0bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b07d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36edd6b590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edd6ae40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36edd6aff0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edd6a270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edd6b710> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36eddca210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eddc8230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee1b0980> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eddcbda0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36eddcaf30> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ede06360> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ee2837a0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36ede1a150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36ede19d30> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36edc17290> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edc17aa0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36edc16f30> {"ansible_facts": 
{"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "05", "epoch": "1726853285", "epoch_int": "1726853285", "date": "2024-09-20", "time": "13:28:05", "iso8601_micro": "2024-09-20T17:28:05.107775Z", "iso8601": "2024-09-20T17:28:05Z", "iso8601_basic": "20240920T132805107775", "iso8601_basic_short": "20240920T132805", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": 
"x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # 
cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing 
_random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] 
removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] 
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing 
multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # 
cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] 
removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy 
ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy 
ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob 
# destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # 
destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
[WARNING]: Module invocation had junk after the JSON data: [duplicate of the Python interpreter cleanup/teardown output above; repeat elided] 13273 1726853285.19193: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853285.19199: _low_level_execute_command(): starting 13273 1726853285.19202: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853284.420525-13446-267272453182913/ > /dev/null 2>&1 && sleep 0' 13273 1726853285.19577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853285.19593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853285.19632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853285.19677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853285.19780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853285.19838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853285.19926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853285.20035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853285.22886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853285.22894: stdout chunk (state=3): >>><<< 13273 1726853285.22897: stderr chunk (state=3): >>><<< 13273 1726853285.22899: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853285.22902: handler run complete 13273 1726853285.22904: variable 'ansible_facts' from source: unknown 13273 1726853285.22906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853285.22994: variable 'ansible_facts' from source: unknown 13273 1726853285.23036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853285.23085: attempt loop complete, returning result 13273 1726853285.23088: _execute() done 13273 1726853285.23091: dumping result to json 13273 1726853285.23102: done dumping result, returning 13273 1726853285.23111: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 
[02083763-bbaf-5fc3-657d-0000000001cd] 13273 1726853285.23114: sending task result for task 02083763-bbaf-5fc3-657d-0000000001cd 13273 1726853285.23311: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001cd 13273 1726853285.23314: WORKER PROCESS EXITING ok: [managed_node3] 13273 1726853285.23434: no more pending results, returning what we have 13273 1726853285.23437: results queue empty 13273 1726853285.23437: checking for any_errors_fatal 13273 1726853285.23439: done checking for any_errors_fatal 13273 1726853285.23439: checking for max_fail_percentage 13273 1726853285.23441: done checking for max_fail_percentage 13273 1726853285.23444: checking to see if all hosts have failed and the running result is not ok 13273 1726853285.23445: done checking to see if all hosts have failed 13273 1726853285.23445: getting the remaining hosts for this loop 13273 1726853285.23446: done getting the remaining hosts for this loop 13273 1726853285.23449: getting the next task for host managed_node3 13273 1726853285.23458: done getting next task for host managed_node3 13273 1726853285.23460: ^ task is: TASK: Check if system is ostree 13273 1726853285.23464: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853285.23467: getting variables 13273 1726853285.23468: in VariableManager get_vars() 13273 1726853285.23567: Calling all_inventory to load vars for managed_node3 13273 1726853285.23570: Calling groups_inventory to load vars for managed_node3 13273 1726853285.23575: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853285.23585: Calling all_plugins_play to load vars for managed_node3 13273 1726853285.23588: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853285.23591: Calling groups_plugins_play to load vars for managed_node3 13273 1726853285.23774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853285.23956: done with get_vars() 13273 1726853285.24005: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:28:05 -0400 (0:00:00.956) 0:00:03.130 ****** 13273 1726853285.24129: entering _queue_task() for managed_node3/stat 13273 1726853285.24494: worker is 1 (out of 1 available) 13273 1726853285.24523: exiting _queue_task() for managed_node3/stat 13273 1726853285.24556: done queuing things up, now waiting for results queue to drain 13273 1726853285.24558: waiting for pending results... 
13273 1726853285.24792: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 13273 1726853285.24854: in run() - task 02083763-bbaf-5fc3-657d-0000000001cf 13273 1726853285.24866: variable 'ansible_search_path' from source: unknown 13273 1726853285.24870: variable 'ansible_search_path' from source: unknown 13273 1726853285.24905: calling self._execute() 13273 1726853285.24968: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853285.24975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853285.24985: variable 'omit' from source: magic vars 13273 1726853285.25265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853285.25467: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853285.25503: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853285.25528: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853285.25558: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853285.25621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853285.25639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853285.25660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853285.25680: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853285.25773: Evaluated conditional (not __network_is_ostree is defined): True 13273 1726853285.25777: variable 'omit' from source: magic vars 13273 1726853285.25805: variable 'omit' from source: magic vars 13273 1726853285.25832: variable 'omit' from source: magic vars 13273 1726853285.25853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853285.25879: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853285.25893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853285.25924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853285.25935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853285.25957: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853285.25960: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853285.25962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853285.26028: Set connection var ansible_connection to ssh 13273 1726853285.26038: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853285.26041: Set connection var ansible_shell_executable to /bin/sh 13273 1726853285.26046: Set connection var ansible_shell_type to sh 13273 1726853285.26048: Set connection var ansible_pipelining to False 13273 1726853285.26056: Set connection var ansible_timeout to 10 13273 1726853285.26075: variable 'ansible_shell_executable' from source: unknown 13273 1726853285.26078: variable 'ansible_connection' from 
source: unknown 13273 1726853285.26081: variable 'ansible_module_compression' from source: unknown 13273 1726853285.26083: variable 'ansible_shell_type' from source: unknown 13273 1726853285.26085: variable 'ansible_shell_executable' from source: unknown 13273 1726853285.26089: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853285.26091: variable 'ansible_pipelining' from source: unknown 13273 1726853285.26094: variable 'ansible_timeout' from source: unknown 13273 1726853285.26103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853285.26198: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853285.26211: variable 'omit' from source: magic vars 13273 1726853285.26216: starting attempt loop 13273 1726853285.26218: running the handler 13273 1726853285.26228: _low_level_execute_command(): starting 13273 1726853285.26235: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853285.27011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853285.27030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853285.27075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853285.27147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853285.29548: stdout chunk (state=3): >>>/root <<< 13273 1726853285.29698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853285.29762: stderr chunk (state=3): >>><<< 13273 1726853285.29774: stdout chunk (state=3): >>><<< 13273 1726853285.29821: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853285.29839: _low_level_execute_command(): starting 13273 1726853285.29845: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801 `" && echo ansible-tmp-1726853285.29814-13474-276012714995801="` echo /root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801 `" ) && sleep 0' 13273 1726853285.30443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853285.30477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853285.30483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853285.30485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853285.30487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853285.30506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853285.30519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853285.30549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853285.30553: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853285.30555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853285.30632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853285.33679: stdout chunk (state=3): >>>ansible-tmp-1726853285.29814-13474-276012714995801=/root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801 <<< 13273 1726853285.33978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853285.33983: stdout chunk (state=3): >>><<< 13273 1726853285.33986: stderr chunk (state=3): >>><<< 13273 1726853285.34178: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853285.29814-13474-276012714995801=/root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received 
exit status from master 0 13273 1726853285.34182: variable 'ansible_module_compression' from source: unknown 13273 1726853285.34200: ANSIBALLZ: Using lock for stat 13273 1726853285.34210: ANSIBALLZ: Acquiring lock 13273 1726853285.34218: ANSIBALLZ: Lock acquired: 140136094831184 13273 1726853285.34228: ANSIBALLZ: Creating module 13273 1726853285.53724: ANSIBALLZ: Writing module into payload 13273 1726853285.53997: ANSIBALLZ: Writing module 13273 1726853285.54075: ANSIBALLZ: Renaming module 13273 1726853285.54165: ANSIBALLZ: Done creating module 13273 1726853285.54266: variable 'ansible_facts' from source: unknown 13273 1726853285.54478: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/AnsiballZ_stat.py 13273 1726853285.54956: Sending initial data 13273 1726853285.54966: Sent initial data (151 bytes) 13273 1726853285.56377: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853285.56381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 
1726853285.56450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853285.56688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853285.59032: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853285.59099: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853285.59159: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp70uu4i24 /root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/AnsiballZ_stat.py <<< 13273 1726853285.59178: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/AnsiballZ_stat.py" <<< 13273 1726853285.59222: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp70uu4i24" to remote "/root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/AnsiballZ_stat.py" <<< 13273 1726853285.59988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853285.60153: stderr chunk (state=3): >>><<< 13273 1726853285.60156: stdout chunk (state=3): >>><<< 13273 1726853285.60158: done transferring module to remote 13273 1726853285.60164: _low_level_execute_command(): starting 13273 1726853285.60166: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/ /root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/AnsiballZ_stat.py && sleep 0' 13273 1726853285.60700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853285.60750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853285.60754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853285.60756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853285.60806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853285.60809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853285.60886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853285.63598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853285.63603: stdout chunk (state=3): >>><<< 13273 1726853285.63605: stderr chunk (state=3): >>><<< 13273 1726853285.63711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853285.63723: _low_level_execute_command(): starting 13273 1726853285.63726: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/AnsiballZ_stat.py && sleep 0' 13273 1726853285.64291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853285.64316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853285.64319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853285.64328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853285.64364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
13273 1726853285.64418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853285.67658: stdout chunk (state=3): >>>import _frozen_importlib # frozen<<< 13273 1726853285.67662: stdout chunk (state=3): >>> import _imp # builtin <<< 13273 1726853285.67741: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 13273 1726853285.67825: stdout chunk (state=3): >>> import '_weakref' # import '_io' # <<< 13273 1726853285.67847: stdout chunk (state=3): >>>import 'marshal' # <<< 13273 1726853285.67904: stdout chunk (state=3): >>>import 'posix' # <<< 13273 1726853285.67969: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 13273 1726853285.67979: stdout chunk (state=3): >>># installing zipimport hook<<< 13273 1726853285.68024: stdout chunk (state=3): >>> <<< 13273 1726853285.68053: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 13273 1726853285.68147: stdout chunk (state=3): >>># installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 13273 1726853285.68172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.68188: stdout chunk (state=3): >>>import '_codecs' # <<< 13273 1726853285.68259: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 13273 1726853285.68266: stdout chunk (state=3): >>> <<< 13273 1726853285.68306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 13273 1726853285.68321: stdout chunk (state=3): >>> import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f74184d0><<< 13273 1726853285.68343: stdout chunk (state=3): >>> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb8f73e7b30><<< 13273 1726853285.68374: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 13273 1726853285.68388: stdout chunk (state=3): >>> <<< 13273 1726853285.68414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f741aa50><<< 13273 1726853285.68417: stdout chunk (state=3): >>> <<< 13273 1726853285.68481: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 13273 1726853285.68493: stdout chunk (state=3): >>> <<< 13273 1726853285.68501: stdout chunk (state=3): >>>import 'abc' # <<< 13273 1726853285.68543: stdout chunk (state=3): >>> import 'io' # <<< 13273 1726853285.68592: stdout chunk (state=3): >>>import '_stat' # <<< 13273 1726853285.68684: stdout chunk (state=3): >>> import 'stat' # <<< 13273 1726853285.68738: stdout chunk (state=3): >>>import '_collections_abc' # <<< 13273 1726853285.68794: stdout chunk (state=3): >>>import 'genericpath' # <<< 13273 1726853285.68815: stdout chunk (state=3): >>>import 'posixpath' # <<< 13273 1726853285.68868: stdout chunk (state=3): >>>import 'os' # <<< 13273 1726853285.68897: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 13273 1726853285.68903: stdout chunk (state=3): >>> <<< 13273 1726853285.68941: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages'<<< 13273 1726853285.68955: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 13273 1726853285.68982: stdout chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 13273 1726853285.69017: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py<<< 13273 1726853285.69032: stdout chunk (state=3): >>> <<< 13273 1726853285.69035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 13273 1726853285.69052: stdout chunk (state=3): >>> <<< 13273 1726853285.69153: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f722d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 13273 1726853285.69162: stdout chunk (state=3): >>> <<< 13273 1726853285.69186: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 13273 1726853285.69189: stdout chunk (state=3): >>> <<< 13273 1726853285.69214: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f722dfa0><<< 13273 1726853285.69220: stdout chunk (state=3): >>> <<< 13273 1726853285.69266: stdout chunk (state=3): >>>import 'site' # <<< 13273 1726853285.69323: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 13273 1726853285.69386: stdout chunk (state=3): >>> Type "help", "copyright", "credits" or "license" for more information. 
<<< 13273 1726853285.69714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 13273 1726853285.69746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 13273 1726853285.69797: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 13273 1726853285.69800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 13273 1726853285.69839: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13273 1726853285.69898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 13273 1726853285.69933: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 13273 1726853285.69950: stdout chunk (state=3): >>> <<< 13273 1726853285.69990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f726be90><<< 13273 1726853285.70006: stdout chunk (state=3): >>> <<< 13273 1726853285.70024: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 13273 1726853285.70057: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 13273 1726853285.70105: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f726bf50><<< 13273 1726853285.70131: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 13273 1726853285.70173: 
stdout chunk (state=3): >>> <<< 13273 1726853285.70279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 13273 1726853285.70301: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.70458: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 13273 1726853285.70496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72a3ec0> import '_collections' # <<< 13273 1726853285.70717: stdout chunk (state=3): >>> import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7283b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7281280> <<< 13273 1726853285.70820: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7269040><<< 13273 1726853285.70833: stdout chunk (state=3): >>> <<< 13273 1726853285.70860: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py<<< 13273 1726853285.70911: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 13273 1726853285.70934: stdout chunk (state=3): >>>import '_sre' # <<< 13273 1726853285.70959: 
stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 13273 1726853285.71014: stdout chunk (state=3): >>> <<< 13273 1726853285.71081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 13273 1726853285.71116: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 13273 1726853285.71169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72c37d0> <<< 13273 1726853285.71197: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72c23f0> <<< 13273 1726853285.71238: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 13273 1726853285.71241: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7282150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72c0c20> <<< 13273 1726853285.71317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72f8860><<< 13273 1726853285.71426: stdout chunk (state=3): >>> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 13273 1726853285.71429: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.71490: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f72f8d10><<< 13273 1726853285.71504: stdout chunk (state=3): >>> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72f8bc0> <<< 13273 1726853285.71518: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.71537: stdout chunk (state=3): >>> # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.71569: stdout chunk (state=3): >>> import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f72f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7266de0><<< 13273 1726853285.71606: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 13273 1726853285.71625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.71690: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc'<<< 13273 1726853285.71791: stdout chunk (state=3): >>> import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72f92e0> import 
'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72fa510> import 'importlib.util' # <<< 13273 1726853285.71803: stdout chunk (state=3): >>>import 'runpy' # <<< 13273 1726853285.71816: stdout chunk (state=3): >>> <<< 13273 1726853285.71840: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 13273 1726853285.71884: stdout chunk (state=3): >>> <<< 13273 1726853285.71910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 13273 1726853285.71937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py<<< 13273 1726853285.71957: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc'<<< 13273 1726853285.71968: stdout chunk (state=3): >>> import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7310710> <<< 13273 1726853285.72021: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.72040: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.72073: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f7311df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 13273 1726853285.72085: stdout chunk (state=3): >>> <<< 13273 1726853285.72108: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 13273 1726853285.72143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 13273 1726853285.72167: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7312c90> <<< 13273 1726853285.72225: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.72240: stdout chunk (state=3): >>> import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f73132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f73121e0><<< 13273 1726853285.72267: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py<<< 13273 1726853285.72273: stdout chunk (state=3): >>> <<< 13273 1726853285.72298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 13273 1726853285.72302: stdout chunk (state=3): >>> <<< 13273 1726853285.72365: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.72394: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f7313d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f73134a0><<< 13273 1726853285.72400: stdout chunk (state=3): >>> <<< 13273 1726853285.72486: stdout chunk (state=3): >>>import 'shutil' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72fa540> <<< 13273 1726853285.72492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 13273 1726853285.72543: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 13273 1726853285.72546: stdout chunk (state=3): >>> <<< 13273 1726853285.72577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 13273 1726853285.72583: stdout chunk (state=3): >>> <<< 13273 1726853285.72613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 13273 1726853285.72668: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.72674: stdout chunk (state=3): >>> <<< 13273 1726853285.72689: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.72728: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70d3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 13273 1726853285.72778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.72795: stdout chunk (state=3): >>> # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.72815: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70fc710> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70fc470> <<< 13273 1726853285.72895: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70fc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py<<< 13273 1726853285.72897: stdout chunk (state=3): >>> <<< 13273 1726853285.72915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 13273 1726853285.73018: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.73096: stdout chunk (state=3): >>> <<< 13273 1726853285.73249: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.73260: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70fd070> <<< 13273 1726853285.73487: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.73524: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.73561: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70fda30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70fc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70d1df0> <<< 13273 1726853285.73634: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 13273 1726853285.73655: stdout chunk (state=3): >>> <<< 13273 1726853285.73687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py<<< 13273 1726853285.73690: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc'<<< 13273 1726853285.73707: stdout chunk (state=3): >>> <<< 13273 1726853285.73750: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70fee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70fdb50><<< 13273 1726853285.73755: stdout chunk (state=3): >>> <<< 13273 1726853285.73785: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72fac30><<< 13273 1726853285.73828: stdout chunk (state=3): >>> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 13273 1726853285.73927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.74009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 13273 1726853285.74015: stdout chunk (state=3): >>> <<< 13273 1726853285.74056: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f71231a0> <<< 13273 1726853285.74139: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/__init__.py <<< 13273 1726853285.74170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 13273 1726853285.74203: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py<<< 13273 1726853285.74208: stdout chunk (state=3): >>> <<< 13273 1726853285.74241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 13273 1726853285.74287: stdout chunk (state=3): >>> <<< 13273 1726853285.74309: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f714b560><<< 13273 1726853285.74314: stdout chunk (state=3): >>> <<< 13273 1726853285.74351: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py<<< 13273 1726853285.74360: stdout chunk (state=3): >>> <<< 13273 1726853285.74484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 13273 1726853285.74516: stdout chunk (state=3): >>>import 'ntpath' # <<< 13273 1726853285.74566: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 13273 1726853285.74585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.74591: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f71ac2f0> <<< 13273 1726853285.74664: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 13273 1726853285.74672: stdout chunk (state=3): >>> <<< 13273 
1726853285.74713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 13273 1726853285.74716: stdout chunk (state=3): >>> <<< 13273 1726853285.74780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 13273 1726853285.74927: stdout chunk (state=3): >>> import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f71aea50> <<< 13273 1726853285.75055: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f71ac410><<< 13273 1726853285.75060: stdout chunk (state=3): >>> <<< 13273 1726853285.75124: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f716d310> <<< 13273 1726853285.75168: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 13273 1726853285.75198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 13273 1726853285.75208: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6fad430> <<< 13273 1726853285.75242: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f714a360> <<< 13273 1726853285.75265: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70ffd70> <<< 13273 1726853285.75454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 13273 1726853285.75459: stdout chunk (state=3): >>> <<< 13273 1726853285.75495: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb8f6fad6d0><<< 13273 1726853285.75583: stdout chunk (state=3): >>> 
<<< 13273 1726853285.75700: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_ud7qiteg/ansible_stat_payload.zip' <<< 13273 1726853285.75885: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.75967: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.75972: stdout chunk (state=3): >>> <<< 13273 1726853285.76019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 13273 1726853285.76025: stdout chunk (state=3): >>> <<< 13273 1726853285.76058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13273 1726853285.76138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 13273 1726853285.76336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13273 1726853285.76395: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 13273 1726853285.76407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 13273 1726853285.76427: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7003170> <<< 13273 1726853285.76458: stdout chunk (state=3): >>>import '_typing' # <<< 13273 1726853285.76463: stdout chunk (state=3): >>> <<< 13273 1726853285.76816: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6fe2060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6fe11f0> # zipimport: zlib available import 'ansible' # <<< 13273 1726853285.76857: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available <<< 13273 1726853285.76900: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available<<< 13273 1726853285.77081: stdout chunk (state=3): >>> <<< 13273 1726853285.79165: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.81114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 13273 1726853285.81131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 13273 1726853285.81150: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7001010> <<< 13273 1726853285.81193: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 13273 1726853285.81202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.81239: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 13273 1726853285.81256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 13273 1726853285.81296: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 13273 1726853285.81328: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.81389: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7fb8f702aa50> <<< 13273 1726853285.81403: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f702a7e0> <<< 13273 1726853285.81451: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f702a0f0> <<< 13273 1726853285.81477: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 13273 1726853285.81595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f702ab40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7003b90> import 'atexit' # <<< 13273 1726853285.81608: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.81617: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f702b740> <<< 13273 1726853285.81648: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.81667: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f702b980> <<< 13273 1726853285.81704: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13273 1726853285.81794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 13273 1726853285.81827: stdout chunk 
(state=3): >>>import '_locale' # <<< 13273 1726853285.81881: stdout chunk (state=3): >>> <<< 13273 1726853285.81914: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f702bec0> <<< 13273 1726853285.81944: stdout chunk (state=3): >>>import 'pwd' # <<< 13273 1726853285.81953: stdout chunk (state=3): >>> <<< 13273 1726853285.82031: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 13273 1726853285.82093: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f690dca0> <<< 13273 1726853285.82145: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.82152: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.82199: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f690f890> <<< 13273 1726853285.82202: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 13273 1726853285.82205: stdout chunk (state=3): >>> <<< 13273 1726853285.82291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6910290><<< 13273 1726853285.82294: stdout chunk (state=3): >>> <<< 13273 1726853285.82322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 13273 1726853285.82331: stdout chunk (state=3): >>> <<< 13273 1726853285.82385: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 13273 1726853285.82418: stdout chunk (state=3): >>> import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6911430><<< 13273 1726853285.82423: stdout chunk (state=3): >>> <<< 13273 1726853285.82458: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 13273 1726853285.82522: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 13273 1726853285.82527: stdout chunk (state=3): >>> <<< 13273 1726853285.82559: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 13273 1726853285.82578: stdout chunk (state=3): >>> <<< 13273 1726853285.82581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 13273 1726853285.82598: stdout chunk (state=3): >>> <<< 13273 1726853285.82683: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6913ef0><<< 13273 1726853285.82744: stdout chunk (state=3): >>> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.82759: stdout chunk (state=3): >>> <<< 13273 1726853285.82765: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.82791: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f7312c00> <<< 13273 1726853285.82854: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69121e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py<<< 13273 1726853285.82858: stdout chunk (state=3): >>> <<< 13273 1726853285.82911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 13273 1726853285.82949: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 13273 1726853285.82954: stdout chunk (state=3): >>> <<< 13273 1726853285.82981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 13273 1726853285.83002: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py<<< 13273 1726853285.83059: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 13273 1726853285.83061: stdout chunk (state=3): >>> <<< 13273 1726853285.83092: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 13273 1726853285.83105: stdout chunk (state=3): >>> <<< 13273 1726853285.83118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 13273 1726853285.83136: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f691be00><<< 13273 1726853285.83164: stdout chunk (state=3): >>> import '_tokenize' # <<< 13273 1726853285.83290: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f691a8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f691a630> <<< 13273 1726853285.83296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py<<< 13273 1726853285.83319: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13273 1726853285.83480: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f691aba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69126f0><<< 13273 1726853285.83485: stdout chunk (state=3): >>> <<< 13273 1726853285.83522: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.83542: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.83585: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f6963a40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 13273 1726853285.83595: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 13273 1726853285.83606: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6964140><<< 13273 1726853285.83638: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 13273 1726853285.83646: stdout chunk (state=3): >>> <<< 13273 1726853285.83670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 13273 1726853285.83715: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 13273 1726853285.83718: stdout chunk (state=3): >>> <<< 13273 1726853285.83732: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 13273 1726853285.83791: stdout chunk (state=3): >>> # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.83808: stdout chunk (state=3): >>> <<< 13273 1726853285.83811: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.83829: stdout chunk (state=3): >>> <<< 13273 1726853285.83832: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f6965be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69659a0><<< 13273 1726853285.83870: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 13273 1726853285.84076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 13273 1726853285.84148: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.84164: stdout chunk (state=3): >>> <<< 13273 1726853285.84167: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.84184: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f6968110><<< 13273 1726853285.84188: stdout chunk (state=3): >>> <<< 13273 1726853285.84205: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69662d0><<< 13273 1726853285.84245: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 13273 1726853285.84323: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.84373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 13273 1726853285.84463: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f696b8f0> <<< 13273 1726853285.84680: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69682c0><<< 13273 1726853285.84687: stdout chunk (state=3): >>> <<< 13273 1726853285.84770: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.84793: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.84858: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f696c650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.84868: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.84944: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f696c9e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.84973: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.84985: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f696cc20> <<< 13273 1726853285.85020: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6964260><<< 13273 1726853285.85057: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 13273 1726853285.85062: stdout chunk (state=3): >>> <<< 13273 1726853285.85075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 13273 1726853285.85114: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 13273 1726853285.85120: stdout chunk (state=3): >>> <<< 13273 1726853285.85177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 13273 1726853285.85270: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 13273 1726853285.85281: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f69f4410><<< 13273 1726853285.85386: stdout chunk (state=3): >>> <<< 13273 1726853285.85539: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.85549: stdout chunk (state=3): >>> <<< 13273 1726853285.85564: stdout chunk (state=3): >>># extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.85587: stdout chunk (state=3): >>> import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f69f55e0><<< 13273 1726853285.85590: stdout chunk (state=3): >>> <<< 13273 1726853285.85618: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f696eba0><<< 13273 1726853285.85621: stdout chunk (state=3): >>> <<< 13273 1726853285.85663: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.85670: stdout chunk (state=3): >>> <<< 13273 1726853285.85683: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.85694: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f696ff20> <<< 13273 1726853285.85705: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f696e7b0> <<< 13273 1726853285.85755: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 13273 1726853285.85763: stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # <<< 13273 1726853285.85799: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853285.86081: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.86105: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853285.86116: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.86132: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 13273 1726853285.86137: stdout chunk (state=3): >>> <<< 13273 1726853285.86165: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 
1726853285.86257: stdout chunk (state=3): >>> <<< 13273 1726853285.86264: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 13273 1726853285.86288: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.86466: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.86645: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.86687: stdout chunk (state=3): >>> <<< 13273 1726853285.87623: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.87792: stdout chunk (state=3): >>> <<< 13273 1726853285.88540: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 13273 1726853285.88555: stdout chunk (state=3): >>> <<< 13273 1726853285.88611: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 13273 1726853285.88702: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 13273 1726853285.88707: stdout chunk (state=3): >>> <<< 13273 1726853285.88724: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.88802: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.88810: stdout chunk (state=3): >>> <<< 13273 1726853285.88822: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.88828: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f69fd880> <<< 13273 1726853285.88990: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 13273 1726853285.89068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 13273 1726853285.89075: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69fe660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69f5700><<< 13273 1726853285.89162: stdout chunk (state=3): >>> import 'ansible.module_utils.compat.selinux' # <<< 13273 1726853285.89208: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.89224: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.89264: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 13273 1726853285.89297: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853285.89535: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.89591: stdout chunk (state=3): >>> <<< 13273 1726853285.89790: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 13273 1726853285.89810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69fe360> <<< 13273 1726853285.89834: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.90614: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.90976: stdout chunk (state=3): >>> <<< 13273 1726853285.91363: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.91460: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.91481: stdout chunk (state=3): >>> <<< 13273 1726853285.91599: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 
13273 1726853285.91620: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.91689: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.91722: stdout chunk (state=3): >>> <<< 13273 1726853285.91760: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 13273 1726853285.91875: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.91995: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 13273 1726853285.92020: stdout chunk (state=3): >>> <<< 13273 1726853285.92050: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 13273 1726853285.92082: stdout chunk (state=3): >>> import 'ansible.module_utils.parsing' # <<< 13273 1726853285.92102: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.92123: stdout chunk (state=3): >>> <<< 13273 1726853285.92173: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.92196: stdout chunk (state=3): >>> <<< 13273 1726853285.92248: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available<<< 13273 1726853285.92390: stdout chunk (state=3): >>> <<< 13273 1726853285.92632: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.93010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 13273 1726853285.93015: stdout chunk (state=3): >>> <<< 13273 1726853285.93115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc'<<< 13273 1726853285.93122: stdout chunk (state=3): >>> <<< 13273 1726853285.93153: stdout chunk (state=3): >>>import '_ast' # <<< 13273 1726853285.93164: stdout chunk (state=3): >>> <<< 13273 1726853285.93270: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69ff800><<< 
13273 1726853285.93285: stdout chunk (state=3): >>> <<< 13273 1726853285.93483: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853285.93511: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 13273 1726853285.93533: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 13273 1726853285.93559: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 13273 1726853285.93586: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 13273 1726853285.93675: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 13273 1726853285.93680: stdout chunk (state=3): >>> <<< 13273 1726853285.93732: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 13273 1726853285.93741: stdout chunk (state=3): >>> <<< 13273 1726853285.93762: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.93826: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.93896: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853285.93989: stdout chunk (state=3): >>># zipimport: zlib available<<< 13273 1726853285.93997: stdout chunk (state=3): >>> <<< 13273 1726853285.94097: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 13273 1726853285.94102: stdout chunk (state=3): >>> <<< 13273 1726853285.94173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 13273 1726853285.94178: stdout chunk (state=3): >>> <<< 13273 1726853285.94312: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' 
executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 13273 1726853285.94381: stdout chunk (state=3): >>> import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f680a180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6805130><<< 13273 1726853285.94384: stdout chunk (state=3): >>> <<< 13273 1726853285.94425: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 13273 1726853285.94429: stdout chunk (state=3): >>> <<< 13273 1726853285.94442: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 13273 1726853285.94464: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853285.94666: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13273 1726853285.94718: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.94786: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 13273 1726853285.94807: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 13273 1726853285.94852: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 13273 1726853285.94897: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13273 1726853285.94938: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 13273 1726853285.95038: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 13273 1726853285.95044: stdout chunk (state=3): >>> <<< 13273 
1726853285.95077: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 13273 1726853285.95080: stdout chunk (state=3): >>> <<< 13273 1726853285.95124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 13273 1726853285.95127: stdout chunk (state=3): >>> <<< 13273 1726853285.95263: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7086b10> <<< 13273 1726853285.95350: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7076810> <<< 13273 1726853285.95532: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f680a330> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69ff170> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available<<< 13273 1726853285.95535: stdout chunk (state=3): >>> # zipimport: zlib available <<< 13273 1726853285.95616: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 13273 1726853285.95630: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 13273 1726853285.95707: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # <<< 13273 1726853285.95779: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 13273 1726853285.95901: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.96021: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.96389: stdout chunk (state=3): >>># zipimport: zlib available <<< 13273 1726853285.96483: stdout chunk (state=3): >>> <<< 13273 1726853285.96504: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": 
"/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 13273 1726853285.96532: stdout chunk (state=3): >>># destroy __main__<<< 13273 1726853285.96578: stdout chunk (state=3): >>> <<< 13273 1726853285.97085: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 13273 1726853285.97107: stdout chunk (state=3): >>> # clear sys.path_hooks # clear builtins._ # clear sys.path<<< 13273 1726853285.97137: stdout chunk (state=3): >>> # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 13273 1726853285.97162: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io<<< 13273 1726853285.97214: stdout chunk (state=3): >>> # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack <<< 13273 1726853285.97220: stdout chunk (state=3): >>># 
destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools<<< 13273 1726853285.97258: stdout chunk (state=3): >>> # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct<<< 13273 1726853285.97266: stdout chunk (state=3): >>> # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util<<< 13273 1726853285.97288: stdout chunk (state=3): >>> # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect<<< 13273 1726853285.97315: stdout chunk (state=3): >>> # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] 
removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib<<< 13273 1726853285.97352: stdout chunk (state=3): >>> # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing <<< 13273 1726853285.97365: stdout chunk (state=3): >>># cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json<<< 13273 1726853285.97397: stdout chunk (state=3): >>> # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token<<< 13273 1726853285.97417: stdout chunk (state=3): >>> # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid<<< 13273 1726853285.97459: stdout chunk (state=3): >>> # cleanup[2] removing _string # cleanup[2] 
removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon<<< 13273 1726853285.97472: stdout chunk (state=3): >>> # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc<<< 13273 1726853285.97497: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text<<< 13273 1726853285.97517: stdout chunk (state=3): >>> # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool<<< 13273 1726853285.97570: 
stdout chunk (state=3): >>> # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters<<< 13273 1726853285.97575: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file <<< 13273 1726853285.97607: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules<<< 13273 1726853285.97791: stdout chunk (state=3): >>> <<< 13273 1726853285.98069: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13273 1726853285.98093: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 13273 1726853285.98132: stdout chunk (state=3): >>># destroy _bz2 
# destroy _compression<<< 13273 1726853285.98166: stdout chunk (state=3): >>> # destroy _lzma<<< 13273 1726853285.98201: stdout chunk (state=3): >>> # destroy _blake2 <<< 13273 1726853285.98226: stdout chunk (state=3): >>># destroy binascii <<< 13273 1726853285.98256: stdout chunk (state=3): >>># destroy struct # destroy zlib # destroy bz2<<< 13273 1726853285.98267: stdout chunk (state=3): >>> # destroy lzma # destroy zipfile._path<<< 13273 1726853285.98315: stdout chunk (state=3): >>> # destroy zipfile<<< 13273 1726853285.98344: stdout chunk (state=3): >>> # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress<<< 13273 1726853285.98400: stdout chunk (state=3): >>> # destroy ntpath <<< 13273 1726853285.98403: stdout chunk (state=3): >>># destroy importlib<<< 13273 1726853285.98421: stdout chunk (state=3): >>> # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux<<< 13273 1726853285.98466: stdout chunk (state=3): >>> # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale<<< 13273 1726853285.98516: stdout chunk (state=3): >>> # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal<<< 13273 1726853285.98520: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog<<< 13273 1726853285.98998: stdout chunk (state=3): >>> # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # 
cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants <<< 13273 1726853285.99002: stdout chunk (state=3): >>># destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8<<< 13273 1726853285.99041: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping 
_io # cleanup[3] wiping _weakref<<< 13273 1726853285.99045: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 13273 1726853285.99062: stdout chunk (state=3): >>> # cleanup[3] wiping builtins<<< 13273 1726853285.99081: stdout chunk (state=3): >>> # destroy selinux._selinux<<< 13273 1726853285.99093: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader<<< 13273 1726853285.99188: stdout chunk (state=3): >>> # destroy systemd._journal # destroy _datetime <<< 13273 1726853285.99435: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 13273 1726853285.99464: stdout chunk (state=3): >>># destroy platform<<< 13273 1726853285.99548: stdout chunk (state=3): >>> # destroy _uuid # destroy stat<<< 13273 1726853285.99567: stdout chunk (state=3): >>> # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 13273 1726853285.99623: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse<<< 13273 1726853285.99666: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 13273 1726853285.99698: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal <<< 13273 1726853285.99769: stdout chunk (state=3): >>># clear sys.meta_path <<< 13273 1726853285.99856: stdout chunk (state=3): >>># clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases <<< 13273 
1726853285.99913: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs<<< 13273 1726853285.99950: stdout chunk (state=3): >>> # destroy io <<< 13273 1726853286.00011: stdout chunk (state=3): >>># destroy traceback <<< 13273 1726853286.00028: stdout chunk (state=3): >>># destroy warnings # destroy weakref <<< 13273 1726853286.00041: stdout chunk (state=3): >>># destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 13273 1726853286.00146: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re<<< 13273 1726853286.00160: stdout chunk (state=3): >>> # destroy itertools <<< 13273 1726853286.00199: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 13273 1726853286.00219: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 13273 1726853286.00288: stdout chunk (state=3): >>> <<< 13273 1726853286.00775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853286.00778: stdout chunk (state=3): >>><<< 13273 1726853286.00780: stderr chunk (state=3): >>><<< 13273 1726853286.00957: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f74184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f73e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f741aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f722d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f722dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f726be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f726bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7283b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7281280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7269040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72c23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7282150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72c0c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f72f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f72f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7266de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7310710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f7311df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7312c90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f73132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f73121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f7313d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f73134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70d3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70fc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70fc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70fc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70fd070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f70fda30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70fc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70d1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70fee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70fdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f72fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f71231a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f714b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f71ac2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f71aea50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f71ac410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f716d310> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6fad430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f714a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f70ffd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb8f6fad6d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_ud7qiteg/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb8f7003170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6fe2060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6fe11f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7001010> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f702aa50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f702a7e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f702a0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f702ab40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7003b90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f702b740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f702b980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f702bec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f690dca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f690f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6910290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6911430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6913ef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f7312c00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69121e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f691be00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f691a8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f691a630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f691aba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69126f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f6963a40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6964140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f6965be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69659a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f6968110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69662d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f696b8f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69682c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f696c650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f696c9e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f696cc20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6964260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f69f4410> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f69f55e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f696eba0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f696ff20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f696e7b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f69fd880> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69fe660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69f5700> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69fe360> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69ff800> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb8f680a180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f6805130> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7086b10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f7076810> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f680a330> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb8f69ff170> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 13273 1726853286.01655: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853286.01658: _low_level_execute_command(): starting 13273 1726853286.01660: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853285.29814-13474-276012714995801/ > /dev/null 2>&1 && sleep 0' 13273 1726853286.02472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853286.02477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853286.02479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853286.02481: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853286.02484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853286.02768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853286.02773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853286.02791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853286.02868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853286.05695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853286.05700: stdout chunk (state=3): >>><<< 13273 1726853286.05703: stderr chunk (state=3): >>><<< 13273 1726853286.05706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853286.05709: handler run complete 13273 1726853286.05711: attempt loop complete, returning result 13273 1726853286.05714: _execute() done 13273 1726853286.05716: dumping result to json 13273 1726853286.05719: done dumping result, returning 13273 1726853286.05721: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [02083763-bbaf-5fc3-657d-0000000001cf] 13273 1726853286.05724: sending task result for task 02083763-bbaf-5fc3-657d-0000000001cf 13273 1726853286.05860: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001cf 13273 1726853286.05864: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 13273 1726853286.05933: no more pending results, returning what we have 13273 1726853286.05936: results queue empty 13273 1726853286.05937: checking for any_errors_fatal 13273 1726853286.05947: done checking for any_errors_fatal 13273 1726853286.05948: checking for max_fail_percentage 13273 1726853286.05949: done checking for max_fail_percentage 13273 1726853286.05950: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.05951: done checking to see if all hosts have failed 13273 1726853286.05951: getting the remaining hosts for this loop 13273 1726853286.05953: done getting the remaining hosts for this loop 13273 1726853286.05956: getting the next task for host managed_node3 13273 1726853286.05962: done getting next task for host managed_node3 13273 1726853286.05964: ^ task is: TASK: Set flag to indicate system is ostree 13273 1726853286.05967: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853286.05970: getting variables 13273 1726853286.06176: in VariableManager get_vars() 13273 1726853286.06208: Calling all_inventory to load vars for managed_node3 13273 1726853286.06211: Calling groups_inventory to load vars for managed_node3 13273 1726853286.06215: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.06228: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.06231: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.06234: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.06519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.06948: done with get_vars() 13273 1726853286.06961: done getting variables 13273 1726853286.07120: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:28:06 -0400 (0:00:00.831) 0:00:03.961 ****** 13273 1726853286.07283: entering _queue_task() for managed_node3/set_fact 13273 1726853286.07285: 
Creating lock for set_fact 13273 1726853286.08013: worker is 1 (out of 1 available) 13273 1726853286.08025: exiting _queue_task() for managed_node3/set_fact 13273 1726853286.08037: done queuing things up, now waiting for results queue to drain 13273 1726853286.08039: waiting for pending results... 13273 1726853286.08780: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 13273 1726853286.08785: in run() - task 02083763-bbaf-5fc3-657d-0000000001d0 13273 1726853286.08788: variable 'ansible_search_path' from source: unknown 13273 1726853286.08791: variable 'ansible_search_path' from source: unknown 13273 1726853286.08794: calling self._execute() 13273 1726853286.08945: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.08994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.09177: variable 'omit' from source: magic vars 13273 1726853286.09938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853286.10310: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853286.10370: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853286.10408: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853286.10447: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853286.10535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853286.10577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 
13273 1726853286.10604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853286.10630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853286.10759: Evaluated conditional (not __network_is_ostree is defined): True 13273 1726853286.10770: variable 'omit' from source: magic vars 13273 1726853286.10821: variable 'omit' from source: magic vars 13273 1726853286.10952: variable '__ostree_booted_stat' from source: set_fact 13273 1726853286.11076: variable 'omit' from source: magic vars 13273 1726853286.11079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853286.11082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853286.11098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853286.11130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853286.11148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853286.11221: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853286.11224: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.11227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.11299: Set connection var ansible_connection to ssh 13273 1726853286.11314: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853286.11337: Set connection var ansible_shell_executable to /bin/sh 
13273 1726853286.11433: Set connection var ansible_shell_type to sh 13273 1726853286.11436: Set connection var ansible_pipelining to False 13273 1726853286.11439: Set connection var ansible_timeout to 10 13273 1726853286.11440: variable 'ansible_shell_executable' from source: unknown 13273 1726853286.11445: variable 'ansible_connection' from source: unknown 13273 1726853286.11447: variable 'ansible_module_compression' from source: unknown 13273 1726853286.11449: variable 'ansible_shell_type' from source: unknown 13273 1726853286.11450: variable 'ansible_shell_executable' from source: unknown 13273 1726853286.11452: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.11454: variable 'ansible_pipelining' from source: unknown 13273 1726853286.11455: variable 'ansible_timeout' from source: unknown 13273 1726853286.11457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.11578: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853286.11581: variable 'omit' from source: magic vars 13273 1726853286.11583: starting attempt loop 13273 1726853286.11585: running the handler 13273 1726853286.11587: handler run complete 13273 1726853286.11596: attempt loop complete, returning result 13273 1726853286.11601: _execute() done 13273 1726853286.11606: dumping result to json 13273 1726853286.11611: done dumping result, returning 13273 1726853286.11650: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [02083763-bbaf-5fc3-657d-0000000001d0] 13273 1726853286.11653: sending task result for task 02083763-bbaf-5fc3-657d-0000000001d0 13273 1726853286.11877: done sending task result for task 
02083763-bbaf-5fc3-657d-0000000001d0
13273 1726853286.11880: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
13273 1726853286.11967: no more pending results, returning what we have
13273 1726853286.11973: results queue empty
13273 1726853286.11974: checking for any_errors_fatal
13273 1726853286.11980: done checking for any_errors_fatal
13273 1726853286.11981: checking for max_fail_percentage
13273 1726853286.11983: done checking for max_fail_percentage
13273 1726853286.11984: checking to see if all hosts have failed and the running result is not ok
13273 1726853286.11984: done checking to see if all hosts have failed
13273 1726853286.11985: getting the remaining hosts for this loop
13273 1726853286.11986: done getting the remaining hosts for this loop
13273 1726853286.12045: getting the next task for host managed_node3
13273 1726853286.12055: done getting next task for host managed_node3
13273 1726853286.12058: ^ task is: TASK: Fix CentOS6 Base repo
13273 1726853286.12061: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 13273 1726853286.12066: getting variables 13273 1726853286.12067: in VariableManager get_vars() 13273 1726853286.12104: Calling all_inventory to load vars for managed_node3 13273 1726853286.12107: Calling groups_inventory to load vars for managed_node3 13273 1726853286.12111: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.12121: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.12124: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.12132: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.12728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.13178: done with get_vars() 13273 1726853286.13188: done getting variables 13273 1726853286.13487: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:28:06 -0400 (0:00:00.062) 0:00:04.024 ****** 13273 1726853286.13516: entering _queue_task() for managed_node3/copy 13273 1726853286.14115: worker is 1 (out of 1 available) 13273 1726853286.14126: exiting _queue_task() for managed_node3/copy 13273 1726853286.14178: done queuing things up, now waiting for results queue to drain 13273 1726853286.14179: waiting for pending results... 
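The "Check if system is ostree" and "Set flag to indicate system is ostree" results above follow a stat-then-set_fact pattern. A hedged reconstruction of what those two tasks in el_repo_setup.yml likely look like (the task names, the /run/ostree-booted path, the `__ostree_booted_stat` and `__network_is_ostree` variables, and the `not __network_is_ostree is defined` condition are all quoted from the log; the exact YAML layout is an assumption, not the test file itself):

```yaml
# Sketch only: reconstructed from the debug log, not copied from the test file.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

This is consistent with the observed output: stat reported `"exists": false`, so the fact `__network_is_ostree` was set to `false`.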
13273 1726853286.14489: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo
13273 1726853286.14494: in run() - task 02083763-bbaf-5fc3-657d-0000000001d2
13273 1726853286.14499: variable 'ansible_search_path' from source: unknown
13273 1726853286.14508: variable 'ansible_search_path' from source: unknown
13273 1726853286.14560: calling self._execute()
13273 1726853286.14654: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853286.14667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853286.14756: variable 'omit' from source: magic vars
13273 1726853286.15095: variable 'ansible_distribution' from source: facts
13273 1726853286.15121: Evaluated conditional (ansible_distribution == 'CentOS'): True
13273 1726853286.15257: variable 'ansible_distribution_major_version' from source: facts
13273 1726853286.15269: Evaluated conditional (ansible_distribution_major_version == '6'): False
13273 1726853286.15280: when evaluation is False, skipping this task
13273 1726853286.15287: _execute() done
13273 1726853286.15307: dumping result to json
13273 1726853286.15313: done dumping result, returning
13273 1726853286.15323: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [02083763-bbaf-5fc3-657d-0000000001d2]
13273 1726853286.15331: sending task result for task 02083763-bbaf-5fc3-657d-0000000001d2
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
13273 1726853286.15806: no more pending results, returning what we have
13273 1726853286.15809: results queue empty
13273 1726853286.15810: checking for any_errors_fatal
13273 1726853286.15815: done checking for any_errors_fatal
13273 1726853286.15816: checking for max_fail_percentage
13273 1726853286.15817: done checking for max_fail_percentage
13273 1726853286.15818: checking to see if all hosts have failed and the
running result is not ok 13273 1726853286.15820: done checking to see if all hosts have failed 13273 1726853286.15821: getting the remaining hosts for this loop 13273 1726853286.15822: done getting the remaining hosts for this loop 13273 1726853286.15825: getting the next task for host managed_node3 13273 1726853286.15830: done getting next task for host managed_node3 13273 1726853286.15833: ^ task is: TASK: Include the task 'enable_epel.yml' 13273 1726853286.15836: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.15847: getting variables 13273 1726853286.15849: in VariableManager get_vars() 13273 1726853286.15880: Calling all_inventory to load vars for managed_node3 13273 1726853286.15883: Calling groups_inventory to load vars for managed_node3 13273 1726853286.15886: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.15895: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.15898: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.15901: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.16341: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001d2 13273 1726853286.16347: WORKER PROCESS EXITING 13273 1726853286.16429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.16896: done with get_vars() 13273 1726853286.16907: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:28:06 -0400 (0:00:00.035) 0:00:04.060 ****** 13273 1726853286.17117: entering _queue_task() for managed_node3/include_tasks 13273 1726853286.17593: worker is 1 (out of 1 available) 13273 1726853286.17604: exiting _queue_task() for managed_node3/include_tasks 13273 1726853286.17614: done queuing things up, now waiting for results queue to drain 13273 1726853286.17615: waiting for pending results... 
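The skip above and the include being queued illustrate Ansible's per-task `when` evaluation: the copy task's condition on ansible_distribution_major_version came back False (`false_condition` in the result), while the include's condition passed. A hedged sketch of the corresponding tasks (the task names and both conditions are quoted from the log's "Evaluated conditional" and `false_condition` lines; the copy module's arguments are not shown in the log and are deliberately elided):

```yaml
# Sketch only: conditions are quoted from the log; module arguments are assumed.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed destination, not in the log
    content: ...                              # elided; the log never shows the content
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'

- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```

On this CentOS 10 run the first task skips (major version is not '6') and the include runs, matching the log.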
13273 1726853286.18189: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 13273 1726853286.18226: in run() - task 02083763-bbaf-5fc3-657d-0000000001d3 13273 1726853286.18279: variable 'ansible_search_path' from source: unknown 13273 1726853286.18286: variable 'ansible_search_path' from source: unknown 13273 1726853286.18310: calling self._execute() 13273 1726853286.18402: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.18477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.18480: variable 'omit' from source: magic vars 13273 1726853286.18970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853286.23098: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853286.23278: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853286.23379: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853286.23429: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853286.23483: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853286.23612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853286.23646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853286.23680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853286.23868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853286.23878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853286.23881: variable '__network_is_ostree' from source: set_fact 13273 1726853286.23890: Evaluated conditional (not __network_is_ostree | d(false)): True 13273 1726853286.23899: _execute() done 13273 1726853286.23906: dumping result to json 13273 1726853286.23912: done dumping result, returning 13273 1726853286.23921: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-5fc3-657d-0000000001d3] 13273 1726853286.23928: sending task result for task 02083763-bbaf-5fc3-657d-0000000001d3 13273 1726853286.24062: no more pending results, returning what we have 13273 1726853286.24067: in VariableManager get_vars() 13273 1726853286.24111: Calling all_inventory to load vars for managed_node3 13273 1726853286.24113: Calling groups_inventory to load vars for managed_node3 13273 1726853286.24117: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.24128: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.24130: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.24132: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.24414: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001d3 13273 1726853286.24417: WORKER PROCESS EXITING 13273 1726853286.24436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 13273 1726853286.24663: done with get_vars() 13273 1726853286.24673: variable 'ansible_search_path' from source: unknown 13273 1726853286.24674: variable 'ansible_search_path' from source: unknown 13273 1726853286.24729: we have included files to process 13273 1726853286.24731: generating all_blocks data 13273 1726853286.24732: done generating all_blocks data 13273 1726853286.24738: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13273 1726853286.24740: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13273 1726853286.24745: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13273 1726853286.26176: done processing included file 13273 1726853286.26179: iterating over new_blocks loaded from include file 13273 1726853286.26180: in VariableManager get_vars() 13273 1726853286.26203: done with get_vars() 13273 1726853286.26205: filtering new block on tags 13273 1726853286.26229: done filtering new block on tags 13273 1726853286.26232: in VariableManager get_vars() 13273 1726853286.26242: done with get_vars() 13273 1726853286.26243: filtering new block on tags 13273 1726853286.26262: done filtering new block on tags 13273 1726853286.26264: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 13273 1726853286.26270: extending task lists for all hosts with included blocks 13273 1726853286.26403: done extending task lists 13273 1726853286.26405: done processing included files 13273 1726853286.26406: results queue empty 13273 1726853286.26406: checking for any_errors_fatal 13273 1726853286.26410: done checking for any_errors_fatal 13273 1726853286.26411: checking for max_fail_percentage 13273 1726853286.26412: done 
checking for max_fail_percentage 13273 1726853286.26412: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.26413: done checking to see if all hosts have failed 13273 1726853286.26414: getting the remaining hosts for this loop 13273 1726853286.26415: done getting the remaining hosts for this loop 13273 1726853286.26432: getting the next task for host managed_node3 13273 1726853286.26437: done getting next task for host managed_node3 13273 1726853286.26440: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 13273 1726853286.26442: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.26445: getting variables 13273 1726853286.26446: in VariableManager get_vars() 13273 1726853286.26454: Calling all_inventory to load vars for managed_node3 13273 1726853286.26456: Calling groups_inventory to load vars for managed_node3 13273 1726853286.26459: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.26464: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.26470: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.26483: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.26665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.26888: done with get_vars() 13273 1726853286.26896: done getting variables 13273 1726853286.26985: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 13273 1726853286.27120: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:28:06 -0400 (0:00:00.100) 0:00:04.160 ****** 13273 1726853286.27181: entering _queue_task() for managed_node3/command 13273 1726853286.27183: Creating lock for command 13273 1726853286.27537: worker is 1 (out of 1 available) 13273 1726853286.27547: exiting _queue_task() for managed_node3/command 13273 1726853286.27559: done queuing things up, now waiting for results queue to drain 13273 1726853286.27560: waiting for pending results... 
13273 1726853286.27817: running TaskExecutor() for managed_node3/TASK: Create EPEL 10
13273 1726853286.27923: in run() - task 02083763-bbaf-5fc3-657d-0000000001ed
13273 1726853286.27939: variable 'ansible_search_path' from source: unknown
13273 1726853286.27949: variable 'ansible_search_path' from source: unknown
13273 1726853286.27989: calling self._execute()
13273 1726853286.28069: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853286.28082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853286.28096: variable 'omit' from source: magic vars
13273 1726853286.28486: variable 'ansible_distribution' from source: facts
13273 1726853286.28503: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
13273 1726853286.28635: variable 'ansible_distribution_major_version' from source: facts
13273 1726853286.28655: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
13273 1726853286.28663: when evaluation is False, skipping this task
13273 1726853286.28670: _execute() done
13273 1726853286.28680: dumping result to json
13273 1726853286.28688: done dumping result, returning
13273 1726853286.28699: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [02083763-bbaf-5fc3-657d-0000000001ed]
13273 1726853286.28708: sending task result for task 02083763-bbaf-5fc3-657d-0000000001ed
13273 1726853286.28918: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001ed
13273 1726853286.28922: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
13273 1726853286.28984: no more pending results, returning what we have
13273 1726853286.28988: results queue empty
13273 1726853286.28989: checking for any_errors_fatal
13273 1726853286.28990: done checking for any_errors_fatal
13273 1726853286.28991: checking for
max_fail_percentage 13273 1726853286.28993: done checking for max_fail_percentage 13273 1726853286.28994: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.28994: done checking to see if all hosts have failed 13273 1726853286.28995: getting the remaining hosts for this loop 13273 1726853286.28997: done getting the remaining hosts for this loop 13273 1726853286.29000: getting the next task for host managed_node3 13273 1726853286.29006: done getting next task for host managed_node3 13273 1726853286.29008: ^ task is: TASK: Install yum-utils package 13273 1726853286.29013: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.29017: getting variables 13273 1726853286.29019: in VariableManager get_vars() 13273 1726853286.29052: Calling all_inventory to load vars for managed_node3 13273 1726853286.29054: Calling groups_inventory to load vars for managed_node3 13273 1726853286.29058: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.29070: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.29074: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.29082: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.29786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.30179: done with get_vars() 13273 1726853286.30190: done getting variables 13273 1726853286.30285: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:28:06 -0400 (0:00:00.031) 0:00:04.192 ****** 13273 1726853286.30315: entering _queue_task() for managed_node3/package 13273 1726853286.30317: Creating lock for package 13273 1726853286.30854: worker is 1 (out of 1 available) 13273 1726853286.30867: exiting _queue_task() for managed_node3/package 13273 1726853286.30881: done queuing things up, now waiting for results queue to drain 13273 1726853286.30882: waiting for pending results... 
13273 1726853286.31181: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 13273 1726853286.31311: in run() - task 02083763-bbaf-5fc3-657d-0000000001ee 13273 1726853286.31329: variable 'ansible_search_path' from source: unknown 13273 1726853286.31336: variable 'ansible_search_path' from source: unknown 13273 1726853286.31380: calling self._execute() 13273 1726853286.31578: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.31582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.31585: variable 'omit' from source: magic vars 13273 1726853286.32485: variable 'ansible_distribution' from source: facts 13273 1726853286.32503: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13273 1726853286.32717: variable 'ansible_distribution_major_version' from source: facts 13273 1726853286.32728: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13273 1726853286.32738: when evaluation is False, skipping this task 13273 1726853286.32745: _execute() done 13273 1726853286.32751: dumping result to json 13273 1726853286.32757: done dumping result, returning 13273 1726853286.32768: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [02083763-bbaf-5fc3-657d-0000000001ee] 13273 1726853286.32779: sending task result for task 02083763-bbaf-5fc3-657d-0000000001ee 13273 1726853286.33091: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001ee 13273 1726853286.33094: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13273 1726853286.33133: no more pending results, returning what we have 13273 1726853286.33136: results queue empty 13273 1726853286.33137: checking for any_errors_fatal 13273 1726853286.33142: done checking for any_errors_fatal 13273 
1726853286.33143: checking for max_fail_percentage 13273 1726853286.33144: done checking for max_fail_percentage 13273 1726853286.33145: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.33146: done checking to see if all hosts have failed 13273 1726853286.33146: getting the remaining hosts for this loop 13273 1726853286.33147: done getting the remaining hosts for this loop 13273 1726853286.33151: getting the next task for host managed_node3 13273 1726853286.33157: done getting next task for host managed_node3 13273 1726853286.33159: ^ task is: TASK: Enable EPEL 7 13273 1726853286.33163: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.33167: getting variables 13273 1726853286.33168: in VariableManager get_vars() 13273 1726853286.33199: Calling all_inventory to load vars for managed_node3 13273 1726853286.33202: Calling groups_inventory to load vars for managed_node3 13273 1726853286.33206: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.33216: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.33219: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.33222: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.33514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.33711: done with get_vars() 13273 1726853286.33720: done getting variables 13273 1726853286.33786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:28:06 -0400 (0:00:00.035) 0:00:04.227 ****** 13273 1726853286.33821: entering _queue_task() for managed_node3/command 13273 1726853286.34099: worker is 1 (out of 1 available) 13273 1726853286.34111: exiting _queue_task() for managed_node3/command 13273 1726853286.34124: done queuing things up, now waiting for results queue to drain 13273 1726853286.34125: waiting for pending results... 
13273 1726853286.34404: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 13273 1726853286.34517: in run() - task 02083763-bbaf-5fc3-657d-0000000001ef 13273 1726853286.34537: variable 'ansible_search_path' from source: unknown 13273 1726853286.34546: variable 'ansible_search_path' from source: unknown 13273 1726853286.34598: calling self._execute() 13273 1726853286.34678: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.34691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.34713: variable 'omit' from source: magic vars 13273 1726853286.35140: variable 'ansible_distribution' from source: facts 13273 1726853286.35156: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13273 1726853286.35477: variable 'ansible_distribution_major_version' from source: facts 13273 1726853286.35480: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13273 1726853286.35483: when evaluation is False, skipping this task 13273 1726853286.35486: _execute() done 13273 1726853286.35488: dumping result to json 13273 1726853286.35490: done dumping result, returning 13273 1726853286.35493: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [02083763-bbaf-5fc3-657d-0000000001ef] 13273 1726853286.35495: sending task result for task 02083763-bbaf-5fc3-657d-0000000001ef 13273 1726853286.35563: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001ef 13273 1726853286.35566: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13273 1726853286.35615: no more pending results, returning what we have 13273 1726853286.35618: results queue empty 13273 1726853286.35619: checking for any_errors_fatal 13273 1726853286.35624: done checking for any_errors_fatal 13273 1726853286.35624: checking for 
max_fail_percentage 13273 1726853286.35626: done checking for max_fail_percentage 13273 1726853286.35626: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.35627: done checking to see if all hosts have failed 13273 1726853286.35628: getting the remaining hosts for this loop 13273 1726853286.35629: done getting the remaining hosts for this loop 13273 1726853286.35631: getting the next task for host managed_node3 13273 1726853286.35637: done getting next task for host managed_node3 13273 1726853286.35639: ^ task is: TASK: Enable EPEL 8 13273 1726853286.35643: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.35646: getting variables 13273 1726853286.35648: in VariableManager get_vars() 13273 1726853286.35754: Calling all_inventory to load vars for managed_node3 13273 1726853286.35758: Calling groups_inventory to load vars for managed_node3 13273 1726853286.35762: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.35775: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.35778: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.35781: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.36054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.36284: done with get_vars() 13273 1726853286.36294: done getting variables 13273 1726853286.36360: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:28:06 -0400 (0:00:00.025) 0:00:04.253 ****** 13273 1726853286.36394: entering _queue_task() for managed_node3/command 13273 1726853286.36642: worker is 1 (out of 1 available) 13273 1726853286.36653: exiting _queue_task() for managed_node3/command 13273 1726853286.36664: done queuing things up, now waiting for results queue to drain 13273 1726853286.36665: waiting for pending results... 
13273 1726853286.36997: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 13273 1726853286.37076: in run() - task 02083763-bbaf-5fc3-657d-0000000001f0 13273 1726853286.37080: variable 'ansible_search_path' from source: unknown 13273 1726853286.37082: variable 'ansible_search_path' from source: unknown 13273 1726853286.37113: calling self._execute() 13273 1726853286.37276: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.37280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.37283: variable 'omit' from source: magic vars 13273 1726853286.37606: variable 'ansible_distribution' from source: facts 13273 1726853286.37627: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13273 1726853286.37749: variable 'ansible_distribution_major_version' from source: facts 13273 1726853286.37763: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13273 1726853286.37777: when evaluation is False, skipping this task 13273 1726853286.37784: _execute() done 13273 1726853286.37789: dumping result to json 13273 1726853286.37796: done dumping result, returning 13273 1726853286.37805: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [02083763-bbaf-5fc3-657d-0000000001f0] 13273 1726853286.37832: sending task result for task 02083763-bbaf-5fc3-657d-0000000001f0 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13273 1726853286.38042: no more pending results, returning what we have 13273 1726853286.38046: results queue empty 13273 1726853286.38046: checking for any_errors_fatal 13273 1726853286.38053: done checking for any_errors_fatal 13273 1726853286.38054: checking for max_fail_percentage 13273 1726853286.38056: done checking for max_fail_percentage 13273 1726853286.38057: checking to see if all hosts have failed and 
the running result is not ok 13273 1726853286.38058: done checking to see if all hosts have failed 13273 1726853286.38058: getting the remaining hosts for this loop 13273 1726853286.38059: done getting the remaining hosts for this loop 13273 1726853286.38063: getting the next task for host managed_node3 13273 1726853286.38237: done getting next task for host managed_node3 13273 1726853286.38240: ^ task is: TASK: Enable EPEL 6 13273 1726853286.38244: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.38247: getting variables 13273 1726853286.38249: in VariableManager get_vars() 13273 1726853286.38275: Calling all_inventory to load vars for managed_node3 13273 1726853286.38277: Calling groups_inventory to load vars for managed_node3 13273 1726853286.38280: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.38286: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001f0 13273 1726853286.38288: WORKER PROCESS EXITING 13273 1726853286.38295: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.38299: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.38302: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.38452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.38842: done with get_vars() 13273 1726853286.38852: done getting variables 13273 1726853286.38909: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:28:06 -0400 (0:00:00.025) 0:00:04.278 ****** 13273 1726853286.38936: entering _queue_task() for managed_node3/copy 13273 1726853286.39521: worker is 1 (out of 1 available) 13273 1726853286.39533: exiting _queue_task() for managed_node3/copy 13273 1726853286.39544: done queuing things up, now waiting for results queue to drain 13273 1726853286.39545: waiting for pending results... 
13273 1726853286.40002: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 13273 1726853286.40433: in run() - task 02083763-bbaf-5fc3-657d-0000000001f2 13273 1726853286.40437: variable 'ansible_search_path' from source: unknown 13273 1726853286.40439: variable 'ansible_search_path' from source: unknown 13273 1726853286.40468: calling self._execute() 13273 1726853286.40561: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.40579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.40613: variable 'omit' from source: magic vars 13273 1726853286.41180: variable 'ansible_distribution' from source: facts 13273 1726853286.41210: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13273 1726853286.41420: variable 'ansible_distribution_major_version' from source: facts 13273 1726853286.41423: Evaluated conditional (ansible_distribution_major_version == '6'): False 13273 1726853286.41426: when evaluation is False, skipping this task 13273 1726853286.41428: _execute() done 13273 1726853286.41431: dumping result to json 13273 1726853286.41433: done dumping result, returning 13273 1726853286.41435: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [02083763-bbaf-5fc3-657d-0000000001f2] 13273 1726853286.41437: sending task result for task 02083763-bbaf-5fc3-657d-0000000001f2 13273 1726853286.41512: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001f2 13273 1726853286.41578: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13273 1726853286.41626: no more pending results, returning what we have 13273 1726853286.41630: results queue empty 13273 1726853286.41631: checking for any_errors_fatal 13273 1726853286.41636: done checking for any_errors_fatal 13273 1726853286.41636: checking for max_fail_percentage 
13273 1726853286.41638: done checking for max_fail_percentage 13273 1726853286.41639: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.41639: done checking to see if all hosts have failed 13273 1726853286.41640: getting the remaining hosts for this loop 13273 1726853286.41641: done getting the remaining hosts for this loop 13273 1726853286.41644: getting the next task for host managed_node3 13273 1726853286.41652: done getting next task for host managed_node3 13273 1726853286.41655: ^ task is: TASK: Set network provider to 'nm' 13273 1726853286.41657: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853286.41661: getting variables 13273 1726853286.41663: in VariableManager get_vars() 13273 1726853286.41695: Calling all_inventory to load vars for managed_node3 13273 1726853286.41698: Calling groups_inventory to load vars for managed_node3 13273 1726853286.41701: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.41712: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.41714: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.41717: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.42019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.42239: done with get_vars() 13273 1726853286.42248: done getting variables 13273 1726853286.42305: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:13 Friday 20 September 2024 13:28:06 -0400 (0:00:00.033) 0:00:04.312 ****** 13273 1726853286.42332: entering _queue_task() for managed_node3/set_fact 13273 1726853286.42788: worker is 1 (out of 1 available) 13273 1726853286.42796: exiting _queue_task() for managed_node3/set_fact 13273 1726853286.42805: done queuing things up, now waiting for results queue to drain 13273 1726853286.42806: waiting for pending results... 13273 1726853286.42865: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 13273 1726853286.42977: in run() - task 02083763-bbaf-5fc3-657d-000000000007 13273 1726853286.42983: variable 'ansible_search_path' from source: unknown 13273 1726853286.43137: calling self._execute() 13273 1726853286.43141: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.43143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.43146: variable 'omit' from source: magic vars 13273 1726853286.43237: variable 'omit' from source: magic vars 13273 1726853286.43275: variable 'omit' from source: magic vars 13273 1726853286.43314: variable 'omit' from source: magic vars 13273 1726853286.43362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853286.43405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853286.43432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853286.43462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853286.43572: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853286.43575: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853286.43578: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.43580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.43617: Set connection var ansible_connection to ssh 13273 1726853286.43632: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853286.43642: Set connection var ansible_shell_executable to /bin/sh 13273 1726853286.43648: Set connection var ansible_shell_type to sh 13273 1726853286.43658: Set connection var ansible_pipelining to False 13273 1726853286.43667: Set connection var ansible_timeout to 10 13273 1726853286.43702: variable 'ansible_shell_executable' from source: unknown 13273 1726853286.43710: variable 'ansible_connection' from source: unknown 13273 1726853286.43718: variable 'ansible_module_compression' from source: unknown 13273 1726853286.43724: variable 'ansible_shell_type' from source: unknown 13273 1726853286.43730: variable 'ansible_shell_executable' from source: unknown 13273 1726853286.43737: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.43745: variable 'ansible_pipelining' from source: unknown 13273 1726853286.43752: variable 'ansible_timeout' from source: unknown 13273 1726853286.43759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.43922: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853286.43936: variable 'omit' from source: magic vars 13273 1726853286.43944: starting 
attempt loop 13273 1726853286.43950: running the handler 13273 1726853286.44132: handler run complete 13273 1726853286.44149: attempt loop complete, returning result 13273 1726853286.44174: _execute() done 13273 1726853286.44184: dumping result to json 13273 1726853286.44192: done dumping result, returning 13273 1726853286.44224: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [02083763-bbaf-5fc3-657d-000000000007] 13273 1726853286.44228: sending task result for task 02083763-bbaf-5fc3-657d-000000000007 ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 13273 1726853286.44382: no more pending results, returning what we have 13273 1726853286.44385: results queue empty 13273 1726853286.44385: checking for any_errors_fatal 13273 1726853286.44389: done checking for any_errors_fatal 13273 1726853286.44390: checking for max_fail_percentage 13273 1726853286.44391: done checking for max_fail_percentage 13273 1726853286.44392: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.44393: done checking to see if all hosts have failed 13273 1726853286.44393: getting the remaining hosts for this loop 13273 1726853286.44395: done getting the remaining hosts for this loop 13273 1726853286.44397: getting the next task for host managed_node3 13273 1726853286.44403: done getting next task for host managed_node3 13273 1726853286.44405: ^ task is: TASK: meta (flush_handlers) 13273 1726853286.44407: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.44410: getting variables 13273 1726853286.44412: in VariableManager get_vars() 13273 1726853286.44443: Calling all_inventory to load vars for managed_node3 13273 1726853286.44446: Calling groups_inventory to load vars for managed_node3 13273 1726853286.44449: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.44461: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.44465: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.44468: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.45035: done sending task result for task 02083763-bbaf-5fc3-657d-000000000007 13273 1726853286.45037: WORKER PROCESS EXITING 13273 1726853286.45058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.45364: done with get_vars() 13273 1726853286.45376: done getting variables 13273 1726853286.45609: in VariableManager get_vars() 13273 1726853286.45662: Calling all_inventory to load vars for managed_node3 13273 1726853286.45664: Calling groups_inventory to load vars for managed_node3 13273 1726853286.45716: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.45721: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.45724: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.45727: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.46221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.46482: done with get_vars() 13273 1726853286.46501: done queuing things up, now waiting for results queue to drain 13273 1726853286.46503: results queue empty 13273 1726853286.46504: checking for any_errors_fatal 13273 1726853286.46506: done checking for any_errors_fatal 13273 1726853286.46507: checking for max_fail_percentage 13273 
1726853286.46512: done checking for max_fail_percentage 13273 1726853286.46513: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.46519: done checking to see if all hosts have failed 13273 1726853286.46520: getting the remaining hosts for this loop 13273 1726853286.46521: done getting the remaining hosts for this loop 13273 1726853286.46523: getting the next task for host managed_node3 13273 1726853286.46528: done getting next task for host managed_node3 13273 1726853286.46530: ^ task is: TASK: meta (flush_handlers) 13273 1726853286.46531: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853286.46539: getting variables 13273 1726853286.46540: in VariableManager get_vars() 13273 1726853286.46547: Calling all_inventory to load vars for managed_node3 13273 1726853286.46550: Calling groups_inventory to load vars for managed_node3 13273 1726853286.46552: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.46556: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.46559: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.46562: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.47015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.47253: done with get_vars() 13273 1726853286.47260: done getting variables 13273 1726853286.47512: in VariableManager get_vars() 13273 1726853286.47524: Calling all_inventory to load vars for managed_node3 13273 1726853286.47527: Calling groups_inventory to load vars for managed_node3 13273 1726853286.47529: Calling all_plugins_inventory to load vars for managed_node3 13273 
1726853286.47533: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.47535: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.47538: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.47811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.48528: done with get_vars() 13273 1726853286.48540: done queuing things up, now waiting for results queue to drain 13273 1726853286.48541: results queue empty 13273 1726853286.48542: checking for any_errors_fatal 13273 1726853286.48543: done checking for any_errors_fatal 13273 1726853286.48544: checking for max_fail_percentage 13273 1726853286.48545: done checking for max_fail_percentage 13273 1726853286.48545: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.48546: done checking to see if all hosts have failed 13273 1726853286.48547: getting the remaining hosts for this loop 13273 1726853286.48550: done getting the remaining hosts for this loop 13273 1726853286.48581: getting the next task for host managed_node3 13273 1726853286.48584: done getting next task for host managed_node3 13273 1726853286.48585: ^ task is: None 13273 1726853286.48586: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.48594: done queuing things up, now waiting for results queue to drain 13273 1726853286.48595: results queue empty 13273 1726853286.48596: checking for any_errors_fatal 13273 1726853286.48597: done checking for any_errors_fatal 13273 1726853286.48597: checking for max_fail_percentage 13273 1726853286.48598: done checking for max_fail_percentage 13273 1726853286.48599: checking to see if all hosts have failed and the running result is not ok 13273 1726853286.48600: done checking to see if all hosts have failed 13273 1726853286.48601: getting the next task for host managed_node3 13273 1726853286.48604: done getting next task for host managed_node3 13273 1726853286.48605: ^ task is: None 13273 1726853286.48606: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.48794: in VariableManager get_vars() 13273 1726853286.48830: done with get_vars() 13273 1726853286.48836: in VariableManager get_vars() 13273 1726853286.48856: done with get_vars() 13273 1726853286.48860: variable 'omit' from source: magic vars 13273 1726853286.48893: in VariableManager get_vars() 13273 1726853286.48915: done with get_vars() 13273 1726853286.48935: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 13273 1726853286.50512: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 13273 1726853286.50654: getting the remaining hosts for this loop 13273 1726853286.50655: done getting the remaining hosts for this loop 13273 1726853286.50658: getting the next task for host managed_node3 13273 1726853286.50660: done getting next task for host managed_node3 13273 1726853286.50662: ^ task is: TASK: Gathering Facts 13273 1726853286.50663: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853286.50665: getting variables 13273 1726853286.50666: in VariableManager get_vars() 13273 1726853286.50720: Calling all_inventory to load vars for managed_node3 13273 1726853286.50723: Calling groups_inventory to load vars for managed_node3 13273 1726853286.50725: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853286.50730: Calling all_plugins_play to load vars for managed_node3 13273 1726853286.50743: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853286.50747: Calling groups_plugins_play to load vars for managed_node3 13273 1726853286.51077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853286.51299: done with get_vars() 13273 1726853286.51307: done getting variables 13273 1726853286.51349: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Friday 20 September 2024 13:28:06 -0400 (0:00:00.090) 0:00:04.402 ****** 13273 1726853286.51381: entering _queue_task() for managed_node3/gather_facts 13273 1726853286.51840: worker is 1 (out of 1 available) 13273 1726853286.51858: exiting _queue_task() for managed_node3/gather_facts 13273 1726853286.51869: done queuing things up, now waiting for results queue to drain 13273 1726853286.51918: waiting for pending results... 
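The verbose entries above all share a `<pid> <epoch.micros>: <message>` prefix (e.g. `13273 1726853286.51381: entering _queue_task() ...`). When digging through a long `-vvvv` log like this one to find where time is going, it can help to parse that prefix and diff consecutive timestamps. A minimal sketch, assuming only the line format visible in this log (the regex and function names here are illustrative, not part of Ansible):

```python
import re

# Each -vvvv line starts with "<pid> <epoch.micros>: <message>", as seen
# throughout the log above. This pattern is an assumption based on that format.
LINE_RE = re.compile(r"^(\d+)\s+(\d+\.\d+):\s+(.*)$")

def parse_entries(text):
    """Yield (pid, timestamp, message) for each line matching the prefix."""
    for line in text.splitlines():
        m = LINE_RE.match(line)
        if m:
            yield int(m.group(1)), float(m.group(2)), m.group(3)

def step_durations(text):
    """Return (elapsed_seconds, message) pairs, slowest steps first.

    Each pair attributes the gap before the *next* entry to the current
    entry's message, i.e. roughly how long that step took.
    """
    entries = list(parse_entries(text))
    gaps = [
        (entries[i + 1][1] - entries[i][1], entries[i][2])
        for i in range(len(entries) - 1)
    ]
    return sorted(gaps, reverse=True)
```

Running `step_durations` over a slice of this log quickly surfaces the expensive steps (here, the SSH round-trips and the `AnsiballZ_setup.py` execution) versus the sub-millisecond bookkeeping entries.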
13273 1726853286.52250: running TaskExecutor() for managed_node3/TASK: Gathering Facts 13273 1726853286.52322: in run() - task 02083763-bbaf-5fc3-657d-000000000218 13273 1726853286.52357: variable 'ansible_search_path' from source: unknown 13273 1726853286.52455: calling self._execute() 13273 1726853286.52532: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.52546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.52567: variable 'omit' from source: magic vars 13273 1726853286.53008: variable 'ansible_distribution_major_version' from source: facts 13273 1726853286.53025: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853286.53045: variable 'omit' from source: magic vars 13273 1726853286.53090: variable 'omit' from source: magic vars 13273 1726853286.53154: variable 'omit' from source: magic vars 13273 1726853286.53308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853286.53312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853286.53314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853286.53328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853286.53347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853286.53553: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853286.53562: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.53570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.53703: Set connection var ansible_connection to ssh 13273 1726853286.53720: Set 
connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853286.53730: Set connection var ansible_shell_executable to /bin/sh 13273 1726853286.53737: Set connection var ansible_shell_type to sh 13273 1726853286.53752: Set connection var ansible_pipelining to False 13273 1726853286.53777: Set connection var ansible_timeout to 10 13273 1726853286.53810: variable 'ansible_shell_executable' from source: unknown 13273 1726853286.53820: variable 'ansible_connection' from source: unknown 13273 1726853286.53828: variable 'ansible_module_compression' from source: unknown 13273 1726853286.53835: variable 'ansible_shell_type' from source: unknown 13273 1726853286.53841: variable 'ansible_shell_executable' from source: unknown 13273 1726853286.53870: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853286.53874: variable 'ansible_pipelining' from source: unknown 13273 1726853286.53876: variable 'ansible_timeout' from source: unknown 13273 1726853286.53881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853286.54090: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853286.54095: variable 'omit' from source: magic vars 13273 1726853286.54107: starting attempt loop 13273 1726853286.54176: running the handler 13273 1726853286.54180: variable 'ansible_facts' from source: unknown 13273 1726853286.54182: _low_level_execute_command(): starting 13273 1726853286.54184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853286.55422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853286.55485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853286.55639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853286.55645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853286.55738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853286.55809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853286.58261: stdout chunk (state=3): >>>/root <<< 13273 1726853286.58424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853286.58435: stdout chunk (state=3): >>><<< 13273 1726853286.58438: stderr chunk (state=3): >>><<< 13273 1726853286.58631: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853286.58635: _low_level_execute_command(): starting 13273 1726853286.58637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061 `" && echo ansible-tmp-1726853286.5859838-13538-26056647452061="` echo /root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061 `" ) && sleep 0' 13273 1726853286.59816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853286.59976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853286.59980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853286.59983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853286.59985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853286.59987: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853286.60014: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853286.60079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853286.60103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853286.60125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853286.60248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853286.63161: stdout chunk (state=3): >>>ansible-tmp-1726853286.5859838-13538-26056647452061=/root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061 <<< 13273 1726853286.63248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853286.63382: stderr chunk (state=3): >>><<< 13273 1726853286.63385: stdout chunk (state=3): >>><<< 13273 1726853286.63388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853286.5859838-13538-26056647452061=/root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853286.63521: variable 'ansible_module_compression' from source: unknown 13273 1726853286.63555: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13273 1726853286.63697: variable 'ansible_facts' from source: unknown 13273 1726853286.64316: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/AnsiballZ_setup.py 13273 1726853286.64887: Sending initial data 13273 1726853286.64890: Sent initial data (153 bytes) 13273 1726853286.67489: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853286.67673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853286.67788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853286.70133: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853286.70191: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853286.70258: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp0nerhjo5 /root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/AnsiballZ_setup.py <<< 13273 1726853286.70278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/AnsiballZ_setup.py" <<< 13273 1726853286.70353: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp0nerhjo5" to remote "/root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/AnsiballZ_setup.py" <<< 13273 1726853286.72632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853286.72692: stderr chunk (state=3): >>><<< 13273 1726853286.72702: stdout chunk (state=3): >>><<< 13273 1726853286.72727: done transferring module to remote 13273 1726853286.72746: _low_level_execute_command(): starting 13273 1726853286.72759: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/ /root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/AnsiballZ_setup.py && sleep 0' 13273 1726853286.73779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration <<< 13273 1726853286.73882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853286.73913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853286.74037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853286.76875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853286.76880: stdout chunk (state=3): >>><<< 13273 1726853286.76882: stderr chunk (state=3): >>><<< 13273 1726853286.77086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13273 1726853286.77089: _low_level_execute_command(): starting 13273 1726853286.77092: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/AnsiballZ_setup.py && sleep 0' 13273 1726853286.78327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853286.78349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853286.78369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853286.78391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853286.78496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13273 1726853287.45278: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": 
{"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", 
"ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": 
{"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", <<< 13273 1726853287.45364: stdout chunk (state=3): >>>"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", 
"XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "07", "epoch": "1726853287", "epoch_int": "1726853287", "date": "2024-09-20", "time": "13:28:07", "iso8601_micro": "2024-09-20T17:28:07.143689Z", "iso8601": "2024-09-20T17:28:07Z", "iso8601_basic": "20240920T132807143689", "iso8601_basic_short": "20240920T132807", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.83056640625, "5m": 0.41162109375, "15m": 0.18798828125}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2993, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 538, "free": 2993}, "nocache": {"free": 3307, "used": 224}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 431, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805641728, "block_size": 4096, "block_total": 65519099, "block_available": 63917393, "block_used": 1601706, "inode_total": 131070960, "inode_available": 131029153, "inode_used": 41807, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13273 1726853287.47449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853287.47453: stdout chunk (state=3): >>><<< 13273 1726853287.47455: stderr chunk (state=3): >>><<< 13273 1726853287.47679: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", 
"127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "07", "epoch": "1726853287", "epoch_int": "1726853287", "date": "2024-09-20", "time": "13:28:07", "iso8601_micro": "2024-09-20T17:28:07.143689Z", "iso8601": "2024-09-20T17:28:07Z", "iso8601_basic": "20240920T132807143689", "iso8601_basic_short": "20240920T132807", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.83056640625, "5m": 0.41162109375, "15m": 0.18798828125}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2993, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 538, "free": 2993}, "nocache": {"free": 3307, "used": 224}, "swap": 
{"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 431, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805641728, 
"block_size": 4096, "block_total": 65519099, "block_available": 63917393, "block_used": 1601706, "inode_total": 131070960, "inode_available": 131029153, "inode_used": 41807, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 
closed. 13273 1726853287.47955: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853287.47989: _low_level_execute_command(): starting 13273 1726853287.48000: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853286.5859838-13538-26056647452061/ > /dev/null 2>&1 && sleep 0' 13273 1726853287.48732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853287.48803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853287.48880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853287.48936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853287.49136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853287.51177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853287.51181: stdout chunk (state=3): >>><<< 13273 1726853287.51184: stderr chunk (state=3): >>><<< 13273 1726853287.51187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853287.51190: handler run complete 13273 1726853287.51193: variable 'ansible_facts' from source: 
unknown 13273 1726853287.51332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853287.51640: variable 'ansible_facts' from source: unknown 13273 1726853287.51799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853287.52051: attempt loop complete, returning result 13273 1726853287.52088: _execute() done 13273 1726853287.52209: dumping result to json 13273 1726853287.52306: done dumping result, returning 13273 1726853287.52318: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-5fc3-657d-000000000218] 13273 1726853287.52338: sending task result for task 02083763-bbaf-5fc3-657d-000000000218 13273 1726853287.53367: done sending task result for task 02083763-bbaf-5fc3-657d-000000000218 13273 1726853287.53370: WORKER PROCESS EXITING ok: [managed_node3] 13273 1726853287.53983: no more pending results, returning what we have 13273 1726853287.53985: results queue empty 13273 1726853287.53986: checking for any_errors_fatal 13273 1726853287.53987: done checking for any_errors_fatal 13273 1726853287.53988: checking for max_fail_percentage 13273 1726853287.53989: done checking for max_fail_percentage 13273 1726853287.53990: checking to see if all hosts have failed and the running result is not ok 13273 1726853287.53991: done checking to see if all hosts have failed 13273 1726853287.53992: getting the remaining hosts for this loop 13273 1726853287.53993: done getting the remaining hosts for this loop 13273 1726853287.53996: getting the next task for host managed_node3 13273 1726853287.54001: done getting next task for host managed_node3 13273 1726853287.54003: ^ task is: TASK: meta (flush_handlers) 13273 1726853287.54005: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853287.54008: getting variables 13273 1726853287.54009: in VariableManager get_vars() 13273 1726853287.54053: Calling all_inventory to load vars for managed_node3 13273 1726853287.54095: Calling groups_inventory to load vars for managed_node3 13273 1726853287.54099: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853287.54109: Calling all_plugins_play to load vars for managed_node3 13273 1726853287.54112: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853287.54115: Calling groups_plugins_play to load vars for managed_node3 13273 1726853287.54345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853287.54550: done with get_vars() 13273 1726853287.54561: done getting variables 13273 1726853287.54642: in VariableManager get_vars() 13273 1726853287.54661: Calling all_inventory to load vars for managed_node3 13273 1726853287.54664: Calling groups_inventory to load vars for managed_node3 13273 1726853287.54666: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853287.54673: Calling all_plugins_play to load vars for managed_node3 13273 1726853287.54675: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853287.54678: Calling groups_plugins_play to load vars for managed_node3 13273 1726853287.54978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853287.55184: done with get_vars() 13273 1726853287.55198: done queuing things up, now waiting for results queue to drain 13273 1726853287.55200: results queue empty 13273 1726853287.55201: checking for any_errors_fatal 13273 1726853287.55204: done checking for any_errors_fatal 13273 1726853287.55205: checking for max_fail_percentage 13273 1726853287.55206: done checking for 
max_fail_percentage 13273 1726853287.55207: checking to see if all hosts have failed and the running result is not ok 13273 1726853287.55208: done checking to see if all hosts have failed 13273 1726853287.55213: getting the remaining hosts for this loop 13273 1726853287.55214: done getting the remaining hosts for this loop 13273 1726853287.55252: getting the next task for host managed_node3 13273 1726853287.55256: done getting next task for host managed_node3 13273 1726853287.55258: ^ task is: TASK: INIT Prepare setup 13273 1726853287.55265: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853287.55267: getting variables 13273 1726853287.55268: in VariableManager get_vars() 13273 1726853287.55292: Calling all_inventory to load vars for managed_node3 13273 1726853287.55298: Calling groups_inventory to load vars for managed_node3 13273 1726853287.55300: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853287.55304: Calling all_plugins_play to load vars for managed_node3 13273 1726853287.55306: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853287.55309: Calling groups_plugins_play to load vars for managed_node3 13273 1726853287.55447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853287.55891: done with get_vars() 13273 1726853287.55898: done getting variables 13273 1726853287.56004: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT 
Prepare setup] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15 Friday 20 September 2024 13:28:07 -0400 (0:00:01.046) 0:00:05.449 ****** 13273 1726853287.56107: entering _queue_task() for managed_node3/debug 13273 1726853287.56110: Creating lock for debug 13273 1726853287.56616: worker is 1 (out of 1 available) 13273 1726853287.56628: exiting _queue_task() for managed_node3/debug 13273 1726853287.56639: done queuing things up, now waiting for results queue to drain 13273 1726853287.56640: waiting for pending results... 13273 1726853287.57219: running TaskExecutor() for managed_node3/TASK: INIT Prepare setup 13273 1726853287.57259: in run() - task 02083763-bbaf-5fc3-657d-00000000000b 13273 1726853287.57287: variable 'ansible_search_path' from source: unknown 13273 1726853287.57415: calling self._execute() 13273 1726853287.57823: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853287.57827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853287.57830: variable 'omit' from source: magic vars 13273 1726853287.58566: variable 'ansible_distribution_major_version' from source: facts 13273 1726853287.58591: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853287.58604: variable 'omit' from source: magic vars 13273 1726853287.58634: variable 'omit' from source: magic vars 13273 1726853287.58675: variable 'omit' from source: magic vars 13273 1726853287.58720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853287.58765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853287.58791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853287.58818: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853287.58835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853287.58878: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853287.58894: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853287.58903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853287.59012: Set connection var ansible_connection to ssh 13273 1726853287.59031: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853287.59039: Set connection var ansible_shell_executable to /bin/sh 13273 1726853287.59044: Set connection var ansible_shell_type to sh 13273 1726853287.59053: Set connection var ansible_pipelining to False 13273 1726853287.59070: Set connection var ansible_timeout to 10 13273 1726853287.59182: variable 'ansible_shell_executable' from source: unknown 13273 1726853287.59188: variable 'ansible_connection' from source: unknown 13273 1726853287.59194: variable 'ansible_module_compression' from source: unknown 13273 1726853287.59200: variable 'ansible_shell_type' from source: unknown 13273 1726853287.59205: variable 'ansible_shell_executable' from source: unknown 13273 1726853287.59209: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853287.59240: variable 'ansible_pipelining' from source: unknown 13273 1726853287.59243: variable 'ansible_timeout' from source: unknown 13273 1726853287.59245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853287.59482: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853287.59498: variable 'omit' from source: magic vars 13273 1726853287.59518: starting attempt loop 13273 1726853287.59527: running the handler 13273 1726853287.59626: handler run complete 13273 1726853287.59713: attempt loop complete, returning result 13273 1726853287.59716: _execute() done 13273 1726853287.59784: dumping result to json 13273 1726853287.59788: done dumping result, returning 13273 1726853287.59791: done running TaskExecutor() for managed_node3/TASK: INIT Prepare setup [02083763-bbaf-5fc3-657d-00000000000b] 13273 1726853287.59794: sending task result for task 02083763-bbaf-5fc3-657d-00000000000b 13273 1726853287.59999: done sending task result for task 02083763-bbaf-5fc3-657d-00000000000b 13273 1726853287.60003: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 13273 1726853287.60061: no more pending results, returning what we have 13273 1726853287.60065: results queue empty 13273 1726853287.60066: checking for any_errors_fatal 13273 1726853287.60068: done checking for any_errors_fatal 13273 1726853287.60069: checking for max_fail_percentage 13273 1726853287.60088: done checking for max_fail_percentage 13273 1726853287.60089: checking to see if all hosts have failed and the running result is not ok 13273 1726853287.60090: done checking to see if all hosts have failed 13273 1726853287.60091: getting the remaining hosts for this loop 13273 1726853287.60093: done getting the remaining hosts for this loop 13273 1726853287.60097: getting the next task for host managed_node3 13273 1726853287.60103: done getting next task for host managed_node3 13273 1726853287.60107: ^ task is: TASK: Install dnsmasq 13273 1726853287.60115: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853287.60121: getting variables 13273 1726853287.60123: in VariableManager get_vars() 13273 1726853287.60390: Calling all_inventory to load vars for managed_node3 13273 1726853287.60393: Calling groups_inventory to load vars for managed_node3 13273 1726853287.60395: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853287.60403: Calling all_plugins_play to load vars for managed_node3 13273 1726853287.60406: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853287.60408: Calling groups_plugins_play to load vars for managed_node3 13273 1726853287.60684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853287.60907: done with get_vars() 13273 1726853287.60919: done getting variables 13273 1726853287.60975: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:28:07 -0400 (0:00:00.049) 0:00:05.499 ****** 13273 1726853287.61007: entering _queue_task() for 
managed_node3/package 13273 1726853287.61301: worker is 1 (out of 1 available) 13273 1726853287.61314: exiting _queue_task() for managed_node3/package 13273 1726853287.61331: done queuing things up, now waiting for results queue to drain 13273 1726853287.61332: waiting for pending results... 13273 1726853287.61588: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 13273 1726853287.61647: in run() - task 02083763-bbaf-5fc3-657d-00000000000f 13273 1726853287.61668: variable 'ansible_search_path' from source: unknown 13273 1726853287.61679: variable 'ansible_search_path' from source: unknown 13273 1726853287.61721: calling self._execute() 13273 1726853287.61876: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853287.61879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853287.61884: variable 'omit' from source: magic vars 13273 1726853287.62191: variable 'ansible_distribution_major_version' from source: facts 13273 1726853287.62208: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853287.62219: variable 'omit' from source: magic vars 13273 1726853287.62275: variable 'omit' from source: magic vars 13273 1726853287.62589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853287.64577: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853287.64649: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853287.64691: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853287.64731: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853287.64761: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853287.64875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853287.64905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853287.64931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853287.65049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853287.65053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853287.65115: variable '__network_is_ostree' from source: set_fact 13273 1726853287.65125: variable 'omit' from source: magic vars 13273 1726853287.65155: variable 'omit' from source: magic vars 13273 1726853287.65190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853287.65577: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853287.65580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853287.65583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853287.65585: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853287.65587: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853287.65588: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853287.65590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853287.65592: Set connection var ansible_connection to ssh 13273 1726853287.65976: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853287.65978: Set connection var ansible_shell_executable to /bin/sh 13273 1726853287.65981: Set connection var ansible_shell_type to sh 13273 1726853287.65984: Set connection var ansible_pipelining to False 13273 1726853287.65989: Set connection var ansible_timeout to 10 13273 1726853287.65992: variable 'ansible_shell_executable' from source: unknown 13273 1726853287.65994: variable 'ansible_connection' from source: unknown 13273 1726853287.65998: variable 'ansible_module_compression' from source: unknown 13273 1726853287.66000: variable 'ansible_shell_type' from source: unknown 13273 1726853287.66002: variable 'ansible_shell_executable' from source: unknown 13273 1726853287.66004: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853287.66005: variable 'ansible_pipelining' from source: unknown 13273 1726853287.66007: variable 'ansible_timeout' from source: unknown 13273 1726853287.66009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853287.66012: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853287.66014: variable 'omit' from source: magic vars 13273 1726853287.66016: starting 
attempt loop 13273 1726853287.66018: running the handler 13273 1726853287.66020: variable 'ansible_facts' from source: unknown 13273 1726853287.66022: variable 'ansible_facts' from source: unknown 13273 1726853287.66533: _low_level_execute_command(): starting 13273 1726853287.66537: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853287.67828: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853287.67832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853287.67835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853287.67837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853287.68197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853287.69996: stdout chunk (state=3): >>>/root <<< 13273 1726853287.70000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853287.70003: stdout chunk (state=3): >>><<< 13273 1726853287.70005: 
stderr chunk (state=3): >>><<< 13273 1726853287.70030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853287.70162: _low_level_execute_command(): starting 13273 1726853287.70167: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547 `" && echo ansible-tmp-1726853287.7012787-13592-129888185162547="` echo /root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547 `" ) && sleep 0' 13273 1726853287.72091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853287.72378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853287.72489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853287.74778: stdout chunk (state=3): >>>ansible-tmp-1726853287.7012787-13592-129888185162547=/root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547 <<< 13273 1726853287.74782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853287.74784: stdout chunk (state=3): >>><<< 13273 1726853287.74786: stderr chunk (state=3): >>><<< 13273 1726853287.74788: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853287.7012787-13592-129888185162547=/root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853287.74790: variable 'ansible_module_compression' from source: unknown 13273 1726853287.74793: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 13273 1726853287.74796: ANSIBALLZ: Acquiring lock 13273 1726853287.74798: ANSIBALLZ: Lock acquired: 140136094830320 13273 1726853287.74799: ANSIBALLZ: Creating module 13273 1726853287.99686: ANSIBALLZ: Writing module into payload 13273 1726853287.99905: ANSIBALLZ: Writing module 13273 1726853287.99934: ANSIBALLZ: Renaming module 13273 1726853287.99943: ANSIBALLZ: Done creating module 13273 1726853287.99964: variable 'ansible_facts' from source: unknown 13273 1726853288.00073: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/AnsiballZ_dnf.py 13273 1726853288.00242: Sending initial data 13273 1726853288.00251: Sent initial data (152 bytes) 13273 1726853288.01311: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853288.01334: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853288.01402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853288.01414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853288.01434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853288.01543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853288.03229: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13273 1726853288.03233: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853288.03627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853288.03631: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp1g8b4jhd /root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/AnsiballZ_dnf.py <<< 13273 1726853288.03633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/AnsiballZ_dnf.py" <<< 13273 1726853288.03674: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp1g8b4jhd" to remote "/root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/AnsiballZ_dnf.py" <<< 13273 1726853288.05337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853288.05427: stderr chunk (state=3): >>><<< 13273 1726853288.05441: stdout chunk (state=3): >>><<< 13273 1726853288.05474: done transferring module to remote 13273 1726853288.05563: _low_level_execute_command(): starting 13273 1726853288.05580: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/ /root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/AnsiballZ_dnf.py && sleep 0' 13273 1726853288.06353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853288.06522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853288.06778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853288.06831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853288.08737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853288.08750: stdout chunk (state=3): >>><<< 13273 1726853288.08762: stderr chunk (state=3): >>><<< 13273 1726853288.08785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853288.08794: _low_level_execute_command(): starting 13273 1726853288.08803: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/AnsiballZ_dnf.py && sleep 0' 13273 1726853288.09840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853288.09860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853288.09881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853288.09901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853288.09918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853288.10093: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 
1726853288.10289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853288.10405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853288.52814: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13273 1726853288.57285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853288.57304: stdout chunk (state=3): >>><<< 13273 1726853288.57316: stderr chunk (state=3): >>><<< 13273 1726853288.57485: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853288.57489: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853288.57712: _low_level_execute_command(): starting 13273 1726853288.57716: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853287.7012787-13592-129888185162547/ > /dev/null 2>&1 && sleep 0' 13273 1726853288.58749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853288.58755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853288.58765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853288.58781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853288.58793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853288.58798: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853288.59064: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853288.59074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853288.59167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853288.61679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853288.61683: stdout chunk (state=3): >>><<< 13273 1726853288.61685: stderr chunk (state=3): >>><<< 13273 1726853288.61689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853288.61692: handler run complete 13273 1726853288.61934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853288.62608: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853288.62648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853288.62684: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853288.62710: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853288.63186: variable '__install_status' from source: unknown 13273 1726853288.63207: Evaluated conditional (__install_status is success): True 13273 1726853288.63227: attempt loop complete, returning result 13273 1726853288.63230: _execute() done 13273 1726853288.63232: dumping result to json 13273 1726853288.63234: done dumping result, returning 13273 1726853288.63246: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [02083763-bbaf-5fc3-657d-00000000000f] 13273 1726853288.63248: sending task result for task 02083763-bbaf-5fc3-657d-00000000000f ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13273 1726853288.63617: no more pending results, returning what we have 13273 1726853288.63620: results queue empty 13273 1726853288.63622: checking for any_errors_fatal 13273 1726853288.63629: done checking for any_errors_fatal 13273 1726853288.63630: checking for max_fail_percentage 13273 
1726853288.63631: done checking for max_fail_percentage 13273 1726853288.63632: checking to see if all hosts have failed and the running result is not ok 13273 1726853288.63633: done checking to see if all hosts have failed 13273 1726853288.63633: getting the remaining hosts for this loop 13273 1726853288.63635: done getting the remaining hosts for this loop 13273 1726853288.63638: getting the next task for host managed_node3 13273 1726853288.63653: done getting next task for host managed_node3 13273 1726853288.63656: ^ task is: TASK: Install pgrep, sysctl 13273 1726853288.63659: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853288.63663: getting variables 13273 1726853288.63665: in VariableManager get_vars() 13273 1726853288.63790: Calling all_inventory to load vars for managed_node3 13273 1726853288.63793: Calling groups_inventory to load vars for managed_node3 13273 1726853288.63795: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853288.63818: done sending task result for task 02083763-bbaf-5fc3-657d-00000000000f 13273 1726853288.63821: WORKER PROCESS EXITING 13273 1726853288.63856: Calling all_plugins_play to load vars for managed_node3 13273 1726853288.63860: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853288.63863: Calling groups_plugins_play to load vars for managed_node3 13273 1726853288.64325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853288.64865: done with get_vars() 13273 1726853288.64879: done getting variables 13273 1726853288.65066: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 13:28:08 -0400 (0:00:01.040) 0:00:06.540 ****** 13273 1726853288.65100: entering _queue_task() for managed_node3/package 13273 1726853288.66006: worker is 1 (out of 1 available) 13273 1726853288.66018: exiting _queue_task() for managed_node3/package 13273 1726853288.66033: done queuing things up, now waiting for results queue to drain 13273 1726853288.66034: waiting for pending results... 
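The "Install dnsmasq" result above (rc=0, "Nothing to do", "attempts": 1, and the `__install_status is success` conditional) is consistent with a task shaped roughly like the following hypothetical reconstruction. The module_args in the log show `name=dnsmasq`, `state=present` executed via `ansible.legacy.dnf`; the `retries`/`delay` values here are illustrative assumptions, not taken from the log:

```yaml
# Hypothetical sketch of the task behind the log events above.
# Grounded in the log: name=dnsmasq, state=present, the registered
# variable __install_status, and the "__install_status is success"
# check. retries/delay are assumed values for illustration only.
- name: Install dnsmasq
  ansible.builtin.package:   # resolves to dnf on this RedHat-family host
    name: dnsmasq
    state: present
  register: __install_status
  until: __install_status is success
  retries: 3
  delay: 10
```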
13273 1726853288.66696: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 13273 1726853288.66717: in run() - task 02083763-bbaf-5fc3-657d-000000000010 13273 1726853288.66795: variable 'ansible_search_path' from source: unknown 13273 1726853288.66803: variable 'ansible_search_path' from source: unknown 13273 1726853288.66845: calling self._execute() 13273 1726853288.67076: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853288.67091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853288.67105: variable 'omit' from source: magic vars 13273 1726853288.67667: variable 'ansible_distribution_major_version' from source: facts 13273 1726853288.67745: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853288.67856: variable 'ansible_os_family' from source: facts 13273 1726853288.67865: Evaluated conditional (ansible_os_family == 'RedHat'): True 13273 1726853288.68026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853288.68595: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853288.68640: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853288.68679: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853288.68840: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853288.69277: variable 'ansible_distribution_major_version' from source: facts 13273 1726853288.69280: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 13273 1726853288.69283: when evaluation is False, skipping this task 13273 1726853288.69286: _execute() done 13273 1726853288.69288: dumping result to json 13273 1726853288.69290: done dumping result, 
returning 13273 1726853288.69292: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [02083763-bbaf-5fc3-657d-000000000010] 13273 1726853288.69294: sending task result for task 02083763-bbaf-5fc3-657d-000000000010 13273 1726853288.69369: done sending task result for task 02083763-bbaf-5fc3-657d-000000000010 13273 1726853288.69377: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 13273 1726853288.69431: no more pending results, returning what we have 13273 1726853288.69435: results queue empty 13273 1726853288.69436: checking for any_errors_fatal 13273 1726853288.69447: done checking for any_errors_fatal 13273 1726853288.69448: checking for max_fail_percentage 13273 1726853288.69450: done checking for max_fail_percentage 13273 1726853288.69451: checking to see if all hosts have failed and the running result is not ok 13273 1726853288.69451: done checking to see if all hosts have failed 13273 1726853288.69452: getting the remaining hosts for this loop 13273 1726853288.69454: done getting the remaining hosts for this loop 13273 1726853288.69457: getting the next task for host managed_node3 13273 1726853288.69464: done getting next task for host managed_node3 13273 1726853288.69466: ^ task is: TASK: Install pgrep, sysctl 13273 1726853288.69470: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13273 1726853288.69476: getting variables 13273 1726853288.69478: in VariableManager get_vars() 13273 1726853288.69530: Calling all_inventory to load vars for managed_node3 13273 1726853288.69533: Calling groups_inventory to load vars for managed_node3 13273 1726853288.69535: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853288.69548: Calling all_plugins_play to load vars for managed_node3 13273 1726853288.69551: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853288.69553: Calling groups_plugins_play to load vars for managed_node3 13273 1726853288.70184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853288.70619: done with get_vars() 13273 1726853288.70632: done getting variables 13273 1726853288.70755: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 13:28:08 -0400 (0:00:00.058) 0:00:06.598 ****** 13273 1726853288.70913: entering _queue_task() for managed_node3/package 13273 1726853288.72123: worker is 1 (out of 1 available) 13273 1726853288.72135: exiting _queue_task() for managed_node3/package 13273 1726853288.72150: done queuing things up, now waiting for results queue to drain 13273 1726853288.72151: waiting for pending results... 
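The log shows two consecutive tasks both named "Install pgrep, sysctl" (task paths `create_test_interfaces_with_dhcp.yml:17` and `:26`): the first is skipped because `ansible_distribution_major_version is version('6', '<=')` evaluates False, and the second runs because `version('7', '>=')` evaluates True, installing `procps-ng` per the module_args later in the log. A hypothetical sketch of the task that runs; the exact structure (whether the shared `!= '6'` and `ansible_os_family == 'RedHat'` conditionals live on a block or on each task) is an assumption:

```yaml
# Hypothetical reconstruction of the :26 task. The package name
# procps-ng and the version('7', '>=') conditional come from the log;
# the placement of the other two evaluated conditionals is assumed.
- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps-ng
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('7', '>=')
```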
13273 1726853288.72692: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 13273 1726853288.73021: in run() - task 02083763-bbaf-5fc3-657d-000000000011 13273 1726853288.73078: variable 'ansible_search_path' from source: unknown 13273 1726853288.73082: variable 'ansible_search_path' from source: unknown 13273 1726853288.73101: calling self._execute() 13273 1726853288.73255: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853288.73267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853288.73286: variable 'omit' from source: magic vars 13273 1726853288.74313: variable 'ansible_distribution_major_version' from source: facts 13273 1726853288.74333: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853288.74627: variable 'ansible_os_family' from source: facts 13273 1726853288.74652: Evaluated conditional (ansible_os_family == 'RedHat'): True 13273 1726853288.74874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853288.75183: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853288.75241: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853288.75288: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853288.75329: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853288.75416: variable 'ansible_distribution_major_version' from source: facts 13273 1726853288.75459: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 13273 1726853288.75462: variable 'omit' from source: magic vars 13273 1726853288.75506: variable 'omit' from source: magic vars 13273 1726853288.75665: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853288.78978: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853288.78983: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853288.78986: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853288.79025: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853288.79107: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853288.79261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853288.79476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853288.79479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853288.79481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853288.79483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853288.79670: variable '__network_is_ostree' from source: set_fact 13273 1726853288.79876: 
variable 'omit' from source: magic vars 13273 1726853288.79879: variable 'omit' from source: magic vars 13273 1726853288.79882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853288.79884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853288.79979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853288.80000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853288.80012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853288.80040: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853288.80047: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853288.80276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853288.80279: Set connection var ansible_connection to ssh 13273 1726853288.80285: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853288.80294: Set connection var ansible_shell_executable to /bin/sh 13273 1726853288.80300: Set connection var ansible_shell_type to sh 13273 1726853288.80307: Set connection var ansible_pipelining to False 13273 1726853288.80314: Set connection var ansible_timeout to 10 13273 1726853288.80345: variable 'ansible_shell_executable' from source: unknown 13273 1726853288.80353: variable 'ansible_connection' from source: unknown 13273 1726853288.80399: variable 'ansible_module_compression' from source: unknown 13273 1726853288.80407: variable 'ansible_shell_type' from source: unknown 13273 1726853288.80414: variable 'ansible_shell_executable' from source: unknown 13273 1726853288.80483: variable 'ansible_host' from source: host vars for 'managed_node3' 
13273 1726853288.80491: variable 'ansible_pipelining' from source: unknown 13273 1726853288.80503: variable 'ansible_timeout' from source: unknown 13273 1726853288.80511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853288.80726: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853288.80740: variable 'omit' from source: magic vars 13273 1726853288.80751: starting attempt loop 13273 1726853288.80934: running the handler 13273 1726853288.80937: variable 'ansible_facts' from source: unknown 13273 1726853288.80940: variable 'ansible_facts' from source: unknown 13273 1726853288.80942: _low_level_execute_command(): starting 13273 1726853288.80943: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853288.82403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853288.82432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853288.82435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853288.82489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853288.82637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853288.82770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853288.83001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853288.84887: stdout chunk (state=3): >>>/root <<< 13273 1726853288.85170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853288.85176: stdout chunk (state=3): >>><<< 13273 1726853288.85178: stderr chunk (state=3): >>><<< 13273 1726853288.85181: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853288.85184: _low_level_execute_command(): starting 13273 1726853288.85188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175 `" && echo ansible-tmp-1726853288.8511875-13654-124154657045175="` echo /root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175 `" ) && sleep 0' 13273 1726853288.86752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853288.86963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853288.87010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853288.89054: stdout chunk (state=3): 
>>>ansible-tmp-1726853288.8511875-13654-124154657045175=/root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175 <<< 13273 1726853288.89287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853288.89384: stderr chunk (state=3): >>><<< 13273 1726853288.89388: stdout chunk (state=3): >>><<< 13273 1726853288.89390: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853288.8511875-13654-124154657045175=/root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853288.89477: variable 'ansible_module_compression' from source: unknown 13273 1726853288.89626: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 13273 1726853288.89820: variable 
'ansible_facts' from source: unknown 13273 1726853288.89953: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/AnsiballZ_dnf.py 13273 1726853288.90414: Sending initial data 13273 1726853288.90418: Sent initial data (152 bytes) 13273 1726853288.91440: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853288.91599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853288.91862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853288.91925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853288.93603: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853288.93702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853288.93817: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpt3hnurqe /root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/AnsiballZ_dnf.py <<< 13273 1726853288.93821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/AnsiballZ_dnf.py" <<< 13273 1726853288.93911: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpt3hnurqe" to remote "/root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/AnsiballZ_dnf.py" <<< 13273 1726853288.96266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853288.96583: stderr chunk (state=3): >>><<< 13273 1726853288.96587: stdout chunk (state=3): >>><<< 13273 1726853288.96589: done transferring module to remote 13273 1726853288.96592: _low_level_execute_command(): starting 13273 1726853288.96594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/ /root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/AnsiballZ_dnf.py && sleep 0' 13273 1726853288.97736: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853288.97856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853288.97925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853288.98111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853288.98180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853289.00150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853289.00154: stdout chunk (state=3): >>><<< 13273 1726853289.00156: stderr chunk (state=3): >>><<< 13273 1726853289.00289: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853289.00298: _low_level_execute_command(): starting 13273 1726853289.00309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/AnsiballZ_dnf.py && sleep 0' 13273 1726853289.01567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853289.01585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853289.01616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853289.01682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853289.01695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853289.01778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853289.01795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853289.01817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853289.01920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853289.44846: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13273 1726853289.49645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853289.49767: stdout chunk (state=3): >>><<< 13273 1726853289.49774: stderr chunk (state=3): >>><<< 13273 1726853289.49777: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853289.49791: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853289.49804: _low_level_execute_command(): starting 13273 1726853289.49814: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853288.8511875-13654-124154657045175/ > /dev/null 2>&1 && sleep 0' 13273 1726853289.50701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853289.50717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853289.50780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853289.50793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853289.50877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853289.50900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853289.51209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853289.53281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853289.53286: stdout chunk (state=3): >>><<< 13273 1726853289.53288: stderr chunk (state=3): >>><<< 13273 1726853289.53293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853289.53295: handler run complete 13273 1726853289.53297: attempt loop complete, returning result 13273 1726853289.53298: _execute() done 13273 1726853289.53300: dumping result to json 13273 1726853289.53302: done dumping result, returning 13273 1726853289.53303: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [02083763-bbaf-5fc3-657d-000000000011] 13273 1726853289.53305: sending task result for task 02083763-bbaf-5fc3-657d-000000000011 13273 1726853289.53684: done sending task result for task 02083763-bbaf-5fc3-657d-000000000011 13273 1726853289.53878: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13273 1726853289.53956: no more pending results, returning what we have 13273 1726853289.53959: results queue empty 13273 1726853289.53960: checking for any_errors_fatal 13273 1726853289.53966: done checking for any_errors_fatal 13273 1726853289.53967: checking for max_fail_percentage 13273 1726853289.53969: done checking for max_fail_percentage 13273 1726853289.53969: checking to see if all hosts have failed and the running result is not ok 13273 1726853289.53970: done checking to see if all hosts have failed 13273 1726853289.53973: getting the remaining hosts for this loop 13273 1726853289.53974: done getting the remaining hosts for this loop 13273 1726853289.53978: getting the next task for host managed_node3 13273 1726853289.53988: done getting next task for host managed_node3 13273 1726853289.53991: ^ task is: TASK: Create test interfaces 13273 1726853289.53995: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853289.53999: getting variables 13273 1726853289.54000: in VariableManager get_vars() 13273 1726853289.54056: Calling all_inventory to load vars for managed_node3 13273 1726853289.54060: Calling groups_inventory to load vars for managed_node3 13273 1726853289.54140: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853289.54154: Calling all_plugins_play to load vars for managed_node3 13273 1726853289.54157: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853289.54159: Calling groups_plugins_play to load vars for managed_node3 13273 1726853289.54697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853289.55147: done with get_vars() 13273 1726853289.55159: done getting variables 13273 1726853289.55260: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 13:28:09 -0400 (0:00:00.845) 0:00:07.443 ****** 13273 1726853289.55427: entering _queue_task() for managed_node3/shell 13273 1726853289.55429: Creating lock for shell 13273 1726853289.55812: worker is 1 (out of 1 
available) 13273 1726853289.55826: exiting _queue_task() for managed_node3/shell 13273 1726853289.55845: done queuing things up, now waiting for results queue to drain 13273 1726853289.55846: waiting for pending results... 13273 1726853289.56018: running TaskExecutor() for managed_node3/TASK: Create test interfaces 13273 1726853289.56137: in run() - task 02083763-bbaf-5fc3-657d-000000000012 13273 1726853289.56377: variable 'ansible_search_path' from source: unknown 13273 1726853289.56380: variable 'ansible_search_path' from source: unknown 13273 1726853289.56383: calling self._execute() 13273 1726853289.56386: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853289.56388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853289.56391: variable 'omit' from source: magic vars 13273 1726853289.56673: variable 'ansible_distribution_major_version' from source: facts 13273 1726853289.56692: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853289.56702: variable 'omit' from source: magic vars 13273 1726853289.56757: variable 'omit' from source: magic vars 13273 1726853289.57132: variable 'dhcp_interface1' from source: play vars 13273 1726853289.57142: variable 'dhcp_interface2' from source: play vars 13273 1726853289.57182: variable 'omit' from source: magic vars 13273 1726853289.57220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853289.57257: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853289.57286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853289.57383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853289.57477: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853289.57481: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853289.57483: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853289.57486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853289.57542: Set connection var ansible_connection to ssh 13273 1726853289.57556: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853289.57562: Set connection var ansible_shell_executable to /bin/sh 13273 1726853289.57564: Set connection var ansible_shell_type to sh 13273 1726853289.57570: Set connection var ansible_pipelining to False 13273 1726853289.57577: Set connection var ansible_timeout to 10 13273 1726853289.57611: variable 'ansible_shell_executable' from source: unknown 13273 1726853289.57614: variable 'ansible_connection' from source: unknown 13273 1726853289.57617: variable 'ansible_module_compression' from source: unknown 13273 1726853289.57619: variable 'ansible_shell_type' from source: unknown 13273 1726853289.57621: variable 'ansible_shell_executable' from source: unknown 13273 1726853289.57623: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853289.57645: variable 'ansible_pipelining' from source: unknown 13273 1726853289.57648: variable 'ansible_timeout' from source: unknown 13273 1726853289.57651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853289.57816: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853289.57820: variable 'omit' from source: magic vars 13273 1726853289.57823: starting attempt 
loop 13273 1726853289.57825: running the handler 13273 1726853289.57828: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853289.57838: _low_level_execute_command(): starting 13273 1726853289.57849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853289.58689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853289.58713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853289.58733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853289.58750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853289.58851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853289.60564: stdout chunk (state=3): >>>/root 
<<< 13273 1726853289.60694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853289.60712: stderr chunk (state=3): >>><<< 13273 1726853289.60725: stdout chunk (state=3): >>><<< 13273 1726853289.60754: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853289.60850: _low_level_execute_command(): starting 13273 1726853289.60854: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909 `" && echo ansible-tmp-1726853289.6076128-13705-188326370632909="` echo /root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909 `" ) && sleep 0' 13273 1726853289.61353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 13273 1726853289.61366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853289.61398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853289.61415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853289.61433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853289.61445: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853289.61485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853289.61550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853289.61567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853289.61590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853289.61737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853289.63715: stdout chunk (state=3): >>>ansible-tmp-1726853289.6076128-13705-188326370632909=/root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909 <<< 13273 1726853289.63893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853289.63897: stdout chunk (state=3): >>><<< 13273 
1726853289.63899: stderr chunk (state=3): >>><<< 13273 1726853289.63925: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853289.6076128-13705-188326370632909=/root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853289.64076: variable 'ansible_module_compression' from source: unknown 13273 1726853289.64080: ANSIBALLZ: Using generic lock for ansible.legacy.command 13273 1726853289.64082: ANSIBALLZ: Acquiring lock 13273 1726853289.64084: ANSIBALLZ: Lock acquired: 140136094830320 13273 1726853289.64087: ANSIBALLZ: Creating module 13273 1726853289.76448: ANSIBALLZ: Writing module into payload 13273 1726853289.76542: ANSIBALLZ: Writing module 13273 1726853289.76565: ANSIBALLZ: Renaming module 13273 1726853289.76580: ANSIBALLZ: Done creating module 13273 
1726853289.76679: variable 'ansible_facts' from source: unknown 13273 1726853289.76683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/AnsiballZ_command.py 13273 1726853289.76896: Sending initial data 13273 1726853289.76904: Sent initial data (156 bytes) 13273 1726853289.77513: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853289.77528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853289.77546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853289.77576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853289.77680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853289.77706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853289.77816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853289.79661: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853289.79742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853289.79858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpgfc_bf_i /root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/AnsiballZ_command.py <<< 13273 1726853289.79862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/AnsiballZ_command.py" <<< 13273 1726853289.79932: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpgfc_bf_i" to remote "/root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/AnsiballZ_command.py" <<< 13273 1726853289.81096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853289.81273: stderr chunk (state=3): >>><<< 13273 1726853289.81277: stdout chunk (state=3): >>><<< 13273 1726853289.81280: done transferring module to remote 13273 1726853289.81282: _low_level_execute_command(): starting 13273 1726853289.81284: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/ /root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/AnsiballZ_command.py && sleep 0' 13273 1726853289.81902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853289.81911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853289.81914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853289.82002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853289.84172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853289.84177: stdout chunk (state=3): >>><<< 13273 1726853289.84180: stderr chunk (state=3): >>><<< 13273 1726853289.84182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853289.84185: _low_level_execute_command(): starting 13273 1726853289.84188: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/AnsiballZ_command.py && sleep 0' 13273 1726853289.84947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853289.84955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853289.84960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853289.84983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853289.84987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853289.85001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853289.85066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853289.85070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853289.85074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853289.85077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853289.85126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853289.85204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.23022: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master 
testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 13273 1726853291.23028: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:28:10.007836", "end": "2024-09-20 13:28:11.228335", "delta": "0:00:01.220499", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853291.24778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853291.24783: stdout chunk (state=3): >>><<< 13273 1726853291.24785: stderr chunk (state=3): >>><<< 13273 1726853291.24912: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:28:10.007836", "end": "2024-09-20 13:28:11.228335", "delta": "0:00:01.220499", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853291.24925: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853291.24965: _low_level_execute_command(): starting 13273 1726853291.24978: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853289.6076128-13705-188326370632909/ > /dev/null 2>&1 && sleep 0' 13273 1726853291.25722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.25811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.25865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.25884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.25914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.26003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.27978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.28014: stderr chunk (state=3): >>><<< 13273 1726853291.28017: stdout chunk (state=3): >>><<< 13273 1726853291.28076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853291.28079: handler run complete 13273 1726853291.28081: Evaluated conditional (False): False 13273 1726853291.28100: attempt loop complete, returning result 13273 1726853291.28103: _execute() done 13273 1726853291.28106: dumping result to json 13273 1726853291.28113: done dumping result, returning 13273 1726853291.28122: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [02083763-bbaf-5fc3-657d-000000000012] 13273 1726853291.28124: sending task result for task 02083763-bbaf-5fc3-657d-000000000012 13273 1726853291.28341: done sending task result for task 02083763-bbaf-5fc3-657d-000000000012 13273 1726853291.28346: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.220499", "end": "2024-09-20 13:28:11.228335", "rc": 0, "start": "2024-09-20 13:28:10.007836" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 13273 1726853291.28431: no more pending results, returning what we have 13273 1726853291.28434: results queue empty 13273 1726853291.28436: checking for any_errors_fatal 13273 1726853291.28444: done checking for any_errors_fatal 13273 1726853291.28445: checking for max_fail_percentage 13273 1726853291.28446: done checking for max_fail_percentage 13273 1726853291.28447: checking to see if all hosts have failed and 
the running result is not ok 13273 1726853291.28448: done checking to see if all hosts have failed 13273 1726853291.28448: getting the remaining hosts for this loop 13273 1726853291.28450: done getting the remaining hosts for this loop 13273 1726853291.28455: getting the next task for host managed_node3 13273 1726853291.28463: done getting next task for host managed_node3 13273 1726853291.28466: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13273 1726853291.28470: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853291.28476: getting variables 13273 1726853291.28477: in VariableManager get_vars() 13273 1726853291.28531: Calling all_inventory to load vars for managed_node3 13273 1726853291.28534: Calling groups_inventory to load vars for managed_node3 13273 1726853291.28536: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853291.28546: Calling all_plugins_play to load vars for managed_node3 13273 1726853291.28548: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853291.28551: Calling groups_plugins_play to load vars for managed_node3 13273 1726853291.28960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853291.29295: done with get_vars() 13273 1726853291.29307: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:28:11 -0400 (0:00:01.739) 0:00:09.183 ****** 13273 1726853291.29411: entering _queue_task() for managed_node3/include_tasks 13273 1726853291.29705: worker is 1 (out of 1 available) 13273 1726853291.29830: exiting _queue_task() for managed_node3/include_tasks 13273 1726853291.29842: done queuing things up, now waiting for results queue to drain 13273 1726853291.29842: waiting for pending results... 
13273 1726853291.30018: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 13273 1726853291.30151: in run() - task 02083763-bbaf-5fc3-657d-000000000016 13273 1726853291.30161: variable 'ansible_search_path' from source: unknown 13273 1726853291.30176: variable 'ansible_search_path' from source: unknown 13273 1726853291.30260: calling self._execute() 13273 1726853291.30315: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.30327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.30342: variable 'omit' from source: magic vars 13273 1726853291.30741: variable 'ansible_distribution_major_version' from source: facts 13273 1726853291.30759: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853291.30770: _execute() done 13273 1726853291.30782: dumping result to json 13273 1726853291.30802: done dumping result, returning 13273 1726853291.30806: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-5fc3-657d-000000000016] 13273 1726853291.30814: sending task result for task 02083763-bbaf-5fc3-657d-000000000016 13273 1726853291.31060: done sending task result for task 02083763-bbaf-5fc3-657d-000000000016 13273 1726853291.31063: WORKER PROCESS EXITING 13273 1726853291.31093: no more pending results, returning what we have 13273 1726853291.31098: in VariableManager get_vars() 13273 1726853291.31159: Calling all_inventory to load vars for managed_node3 13273 1726853291.31162: Calling groups_inventory to load vars for managed_node3 13273 1726853291.31165: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853291.31180: Calling all_plugins_play to load vars for managed_node3 13273 1726853291.31183: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853291.31186: Calling groups_plugins_play to load vars for managed_node3 13273 
1726853291.31511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853291.31702: done with get_vars() 13273 1726853291.31710: variable 'ansible_search_path' from source: unknown 13273 1726853291.31711: variable 'ansible_search_path' from source: unknown 13273 1726853291.31748: we have included files to process 13273 1726853291.31750: generating all_blocks data 13273 1726853291.31751: done generating all_blocks data 13273 1726853291.31752: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853291.31753: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853291.31755: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853291.31997: done processing included file 13273 1726853291.31999: iterating over new_blocks loaded from include file 13273 1726853291.32001: in VariableManager get_vars() 13273 1726853291.32027: done with get_vars() 13273 1726853291.32029: filtering new block on tags 13273 1726853291.32045: done filtering new block on tags 13273 1726853291.32048: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 13273 1726853291.32053: extending task lists for all hosts with included blocks 13273 1726853291.32151: done extending task lists 13273 1726853291.32153: done processing included files 13273 1726853291.32153: results queue empty 13273 1726853291.32154: checking for any_errors_fatal 13273 1726853291.32161: done checking for any_errors_fatal 13273 1726853291.32162: checking for max_fail_percentage 13273 1726853291.32162: done checking for 
max_fail_percentage 13273 1726853291.32163: checking to see if all hosts have failed and the running result is not ok 13273 1726853291.32164: done checking to see if all hosts have failed 13273 1726853291.32165: getting the remaining hosts for this loop 13273 1726853291.32166: done getting the remaining hosts for this loop 13273 1726853291.32168: getting the next task for host managed_node3 13273 1726853291.32174: done getting next task for host managed_node3 13273 1726853291.32177: ^ task is: TASK: Get stat for interface {{ interface }} 13273 1726853291.32180: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853291.32182: getting variables 13273 1726853291.32183: in VariableManager get_vars() 13273 1726853291.32202: Calling all_inventory to load vars for managed_node3 13273 1726853291.32204: Calling groups_inventory to load vars for managed_node3 13273 1726853291.32211: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853291.32216: Calling all_plugins_play to load vars for managed_node3 13273 1726853291.32218: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853291.32223: Calling groups_plugins_play to load vars for managed_node3 13273 1726853291.32395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853291.32588: done with get_vars() 13273 1726853291.32597: done getting variables 13273 1726853291.32756: variable 'interface' from source: task vars 13273 1726853291.32762: variable 'dhcp_interface1' from source: play vars 13273 1726853291.32826: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:28:11 -0400 (0:00:00.034) 0:00:09.217 ****** 13273 1726853291.32872: entering _queue_task() for managed_node3/stat 13273 1726853291.33184: worker is 1 (out of 1 available) 13273 1726853291.33307: exiting _queue_task() for managed_node3/stat 13273 1726853291.33319: done queuing things up, now waiting for results queue to drain 13273 1726853291.33319: waiting for pending results... 
13273 1726853291.33477: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 13273 1726853291.33628: in run() - task 02083763-bbaf-5fc3-657d-000000000248 13273 1726853291.33633: variable 'ansible_search_path' from source: unknown 13273 1726853291.33635: variable 'ansible_search_path' from source: unknown 13273 1726853291.33663: calling self._execute() 13273 1726853291.33762: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.33845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.33849: variable 'omit' from source: magic vars 13273 1726853291.34141: variable 'ansible_distribution_major_version' from source: facts 13273 1726853291.34157: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853291.34173: variable 'omit' from source: magic vars 13273 1726853291.34236: variable 'omit' from source: magic vars 13273 1726853291.34345: variable 'interface' from source: task vars 13273 1726853291.34357: variable 'dhcp_interface1' from source: play vars 13273 1726853291.34436: variable 'dhcp_interface1' from source: play vars 13273 1726853291.34461: variable 'omit' from source: magic vars 13273 1726853291.34514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853291.34558: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853291.34586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853291.34711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853291.34715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853291.34717: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13273 1726853291.34720: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.34722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.34783: Set connection var ansible_connection to ssh 13273 1726853291.34798: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853291.34809: Set connection var ansible_shell_executable to /bin/sh 13273 1726853291.34821: Set connection var ansible_shell_type to sh 13273 1726853291.34836: Set connection var ansible_pipelining to False 13273 1726853291.34847: Set connection var ansible_timeout to 10 13273 1726853291.34880: variable 'ansible_shell_executable' from source: unknown 13273 1726853291.34889: variable 'ansible_connection' from source: unknown 13273 1726853291.34897: variable 'ansible_module_compression' from source: unknown 13273 1726853291.34903: variable 'ansible_shell_type' from source: unknown 13273 1726853291.34909: variable 'ansible_shell_executable' from source: unknown 13273 1726853291.34928: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.34931: variable 'ansible_pipelining' from source: unknown 13273 1726853291.34934: variable 'ansible_timeout' from source: unknown 13273 1726853291.35037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.35169: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853291.35187: variable 'omit' from source: magic vars 13273 1726853291.35198: starting attempt loop 13273 1726853291.35205: running the handler 13273 1726853291.35224: _low_level_execute_command(): starting 13273 1726853291.35239: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 
1726853291.36023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853291.36073: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.36137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.36157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.36188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.36293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.38013: stdout chunk (state=3): >>>/root <<< 13273 1726853291.38164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.38168: stdout chunk (state=3): >>><<< 13273 1726853291.38173: stderr chunk (state=3): >>><<< 13273 1726853291.38195: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853291.38218: _low_level_execute_command(): starting 13273 1726853291.38307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653 `" && echo ansible-tmp-1726853291.3820324-13804-90411741180653="` echo /root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653 `" ) && sleep 0' 13273 1726853291.38852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.38869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.38886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853291.38998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.39029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.39125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.41093: stdout chunk (state=3): >>>ansible-tmp-1726853291.3820324-13804-90411741180653=/root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653 <<< 13273 1726853291.41248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.41283: stdout chunk (state=3): >>><<< 13273 1726853291.41286: stderr chunk (state=3): >>><<< 13273 1726853291.41477: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853291.3820324-13804-90411741180653=/root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853291.41480: variable 'ansible_module_compression' from source: unknown 13273 1726853291.41483: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13273 1726853291.41486: variable 'ansible_facts' from source: unknown 13273 1726853291.41574: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/AnsiballZ_stat.py 13273 1726853291.41730: Sending initial data 13273 1726853291.41885: Sent initial data (152 bytes) 13273 1726853291.42655: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.42675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.42704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853291.42786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.42809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.42823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.42846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.42949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.44632: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853291.44711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853291.44801: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpcsqyrrje /root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/AnsiballZ_stat.py <<< 13273 1726853291.44804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/AnsiballZ_stat.py" <<< 13273 1726853291.44850: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpcsqyrrje" to remote "/root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/AnsiballZ_stat.py" <<< 13273 1726853291.45909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.45912: stdout chunk (state=3): >>><<< 13273 1726853291.45914: stderr chunk (state=3): >>><<< 13273 1726853291.45916: done transferring module to remote 13273 1726853291.45918: _low_level_execute_command(): starting 13273 1726853291.45921: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/ /root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/AnsiballZ_stat.py && sleep 0' 13273 1726853291.46489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.46519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.46538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.46549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.46641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.48526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.48585: stderr chunk (state=3): >>><<< 13273 1726853291.48598: stdout chunk (state=3): >>><<< 13273 1726853291.48625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853291.48641: _low_level_execute_command(): starting 13273 1726853291.48655: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/AnsiballZ_stat.py && sleep 0' 13273 1726853291.49293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.49307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.49388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.49444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.49465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.49482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.49576: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.65533: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28714, "dev": 23, "nlink": 1, "atime": 1726853290.016378, "mtime": 1726853290.016378, "ctime": 1726853290.016378, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13273 1726853291.67008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853291.67013: stdout chunk (state=3): >>><<< 13273 1726853291.67016: stderr chunk (state=3): >>><<< 13273 1726853291.67077: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28714, "dev": 23, "nlink": 1, "atime": 1726853290.016378, "mtime": 1726853290.016378, "ctime": 1726853290.016378, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853291.67132: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853291.67179: _low_level_execute_command(): starting 13273 1726853291.67182: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853291.3820324-13804-90411741180653/ > /dev/null 2>&1 && sleep 0' 13273 1726853291.68062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.68074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.68085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853291.68208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.68212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.68215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.68285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.70376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.70380: stdout chunk (state=3): >>><<< 13273 1726853291.70382: stderr chunk (state=3): >>><<< 13273 1726853291.70387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853291.70393: handler run complete 13273 1726853291.70500: attempt loop complete, returning result 13273 1726853291.70503: _execute() done 13273 1726853291.70506: dumping result to json 13273 1726853291.70511: done dumping result, returning 13273 1726853291.70520: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [02083763-bbaf-5fc3-657d-000000000248] 13273 1726853291.70525: sending task result for task 02083763-bbaf-5fc3-657d-000000000248 13273 1726853291.70665: done sending task result for task 02083763-bbaf-5fc3-657d-000000000248 13273 1726853291.70668: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853290.016378, "block_size": 4096, "blocks": 0, "ctime": 1726853290.016378, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28714, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726853290.016378, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13273 1726853291.70765: no more pending results, returning what we have 13273 1726853291.70768: results queue empty 13273 1726853291.70769: checking for any_errors_fatal 13273 
1726853291.70770: done checking for any_errors_fatal 13273 1726853291.70772: checking for max_fail_percentage 13273 1726853291.70774: done checking for max_fail_percentage 13273 1726853291.70775: checking to see if all hosts have failed and the running result is not ok 13273 1726853291.70775: done checking to see if all hosts have failed 13273 1726853291.70776: getting the remaining hosts for this loop 13273 1726853291.70778: done getting the remaining hosts for this loop 13273 1726853291.70781: getting the next task for host managed_node3 13273 1726853291.70787: done getting next task for host managed_node3 13273 1726853291.70789: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13273 1726853291.70792: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853291.70799: getting variables 13273 1726853291.70800: in VariableManager get_vars() 13273 1726853291.70854: Calling all_inventory to load vars for managed_node3 13273 1726853291.70857: Calling groups_inventory to load vars for managed_node3 13273 1726853291.70859: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853291.70869: Calling all_plugins_play to load vars for managed_node3 13273 1726853291.71033: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853291.71040: Calling groups_plugins_play to load vars for managed_node3 13273 1726853291.71263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853291.71480: done with get_vars() 13273 1726853291.71493: done getting variables 13273 1726853291.71593: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 13273 1726853291.71716: variable 'interface' from source: task vars 13273 1726853291.71720: variable 'dhcp_interface1' from source: play vars 13273 1726853291.71785: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:28:11 -0400 (0:00:00.389) 0:00:09.607 ****** 13273 1726853291.71817: entering _queue_task() for managed_node3/assert 13273 1726853291.71821: Creating lock for assert 13273 1726853291.72111: worker is 1 (out of 1 available) 13273 1726853291.72126: exiting _queue_task() for managed_node3/assert 13273 1726853291.72137: done queuing things up, now waiting for results queue to drain 13273 
1726853291.72138: waiting for pending results... 13273 1726853291.72490: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 13273 1726853291.72504: in run() - task 02083763-bbaf-5fc3-657d-000000000017 13273 1726853291.72509: variable 'ansible_search_path' from source: unknown 13273 1726853291.72512: variable 'ansible_search_path' from source: unknown 13273 1726853291.72536: calling self._execute() 13273 1726853291.72633: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.72646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.72661: variable 'omit' from source: magic vars 13273 1726853291.73108: variable 'ansible_distribution_major_version' from source: facts 13273 1726853291.73122: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853291.73131: variable 'omit' from source: magic vars 13273 1726853291.73183: variable 'omit' from source: magic vars 13273 1726853291.73375: variable 'interface' from source: task vars 13273 1726853291.73378: variable 'dhcp_interface1' from source: play vars 13273 1726853291.73380: variable 'dhcp_interface1' from source: play vars 13273 1726853291.73382: variable 'omit' from source: magic vars 13273 1726853291.73408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853291.73446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853291.73466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853291.73493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853291.73507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853291.73534: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853291.73541: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.73548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.73655: Set connection var ansible_connection to ssh 13273 1726853291.73674: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853291.73686: Set connection var ansible_shell_executable to /bin/sh 13273 1726853291.73703: Set connection var ansible_shell_type to sh 13273 1726853291.73713: Set connection var ansible_pipelining to False 13273 1726853291.73722: Set connection var ansible_timeout to 10 13273 1726853291.73750: variable 'ansible_shell_executable' from source: unknown 13273 1726853291.73759: variable 'ansible_connection' from source: unknown 13273 1726853291.73765: variable 'ansible_module_compression' from source: unknown 13273 1726853291.73774: variable 'ansible_shell_type' from source: unknown 13273 1726853291.73782: variable 'ansible_shell_executable' from source: unknown 13273 1726853291.73807: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.73811: variable 'ansible_pipelining' from source: unknown 13273 1726853291.73813: variable 'ansible_timeout' from source: unknown 13273 1726853291.73816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.73978: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853291.73982: variable 'omit' from source: magic vars 13273 1726853291.73991: starting attempt loop 13273 1726853291.74026: running the handler 13273 1726853291.74141: variable 'interface_stat' from source: set_fact 13273 
1726853291.74164: Evaluated conditional (interface_stat.stat.exists): True 13273 1726853291.74177: handler run complete 13273 1726853291.74244: attempt loop complete, returning result 13273 1726853291.74247: _execute() done 13273 1726853291.74250: dumping result to json 13273 1726853291.74252: done dumping result, returning 13273 1726853291.74254: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [02083763-bbaf-5fc3-657d-000000000017] 13273 1726853291.74256: sending task result for task 02083763-bbaf-5fc3-657d-000000000017 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853291.74397: no more pending results, returning what we have 13273 1726853291.74401: results queue empty 13273 1726853291.74402: checking for any_errors_fatal 13273 1726853291.74410: done checking for any_errors_fatal 13273 1726853291.74411: checking for max_fail_percentage 13273 1726853291.74413: done checking for max_fail_percentage 13273 1726853291.74414: checking to see if all hosts have failed and the running result is not ok 13273 1726853291.74415: done checking to see if all hosts have failed 13273 1726853291.74416: getting the remaining hosts for this loop 13273 1726853291.74417: done getting the remaining hosts for this loop 13273 1726853291.74420: getting the next task for host managed_node3 13273 1726853291.74429: done getting next task for host managed_node3 13273 1726853291.74431: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13273 1726853291.74435: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853291.74438: getting variables 13273 1726853291.74440: in VariableManager get_vars() 13273 1726853291.74498: Calling all_inventory to load vars for managed_node3 13273 1726853291.74501: Calling groups_inventory to load vars for managed_node3 13273 1726853291.74504: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853291.74515: Calling all_plugins_play to load vars for managed_node3 13273 1726853291.74518: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853291.74521: Calling groups_plugins_play to load vars for managed_node3 13273 1726853291.75151: done sending task result for task 02083763-bbaf-5fc3-657d-000000000017 13273 1726853291.75154: WORKER PROCESS EXITING 13273 1726853291.75161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853291.75368: done with get_vars() 13273 1726853291.75380: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:28:11 -0400 (0:00:00.036) 0:00:09.643 ****** 13273 1726853291.75475: entering _queue_task() for managed_node3/include_tasks 13273 1726853291.75723: worker is 1 (out of 1 available) 13273 1726853291.75853: exiting _queue_task() for managed_node3/include_tasks 13273 1726853291.75864: done queuing things up, now waiting for results queue to drain 13273 1726853291.75865: waiting for pending results... 
13273 1726853291.76091: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 13273 1726853291.76163: in run() - task 02083763-bbaf-5fc3-657d-00000000001b 13273 1726853291.76193: variable 'ansible_search_path' from source: unknown 13273 1726853291.76294: variable 'ansible_search_path' from source: unknown 13273 1726853291.76297: calling self._execute() 13273 1726853291.76344: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.76357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.76374: variable 'omit' from source: magic vars 13273 1726853291.76761: variable 'ansible_distribution_major_version' from source: facts 13273 1726853291.76782: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853291.76793: _execute() done 13273 1726853291.76802: dumping result to json 13273 1726853291.76809: done dumping result, returning 13273 1726853291.76820: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-5fc3-657d-00000000001b] 13273 1726853291.76830: sending task result for task 02083763-bbaf-5fc3-657d-00000000001b 13273 1726853291.77083: no more pending results, returning what we have 13273 1726853291.77089: in VariableManager get_vars() 13273 1726853291.77149: Calling all_inventory to load vars for managed_node3 13273 1726853291.77153: Calling groups_inventory to load vars for managed_node3 13273 1726853291.77156: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853291.77300: done sending task result for task 02083763-bbaf-5fc3-657d-00000000001b 13273 1726853291.77304: WORKER PROCESS EXITING 13273 1726853291.77313: Calling all_plugins_play to load vars for managed_node3 13273 1726853291.77316: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853291.77319: Calling groups_plugins_play to load vars for managed_node3 13273 
1726853291.77497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853291.77720: done with get_vars() 13273 1726853291.77732: variable 'ansible_search_path' from source: unknown 13273 1726853291.77734: variable 'ansible_search_path' from source: unknown 13273 1726853291.77768: we have included files to process 13273 1726853291.77770: generating all_blocks data 13273 1726853291.77775: done generating all_blocks data 13273 1726853291.77779: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853291.77780: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853291.77783: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853291.77968: done processing included file 13273 1726853291.77970: iterating over new_blocks loaded from include file 13273 1726853291.77973: in VariableManager get_vars() 13273 1726853291.77999: done with get_vars() 13273 1726853291.78000: filtering new block on tags 13273 1726853291.78014: done filtering new block on tags 13273 1726853291.78016: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 13273 1726853291.78020: extending task lists for all hosts with included blocks 13273 1726853291.78111: done extending task lists 13273 1726853291.78112: done processing included files 13273 1726853291.78112: results queue empty 13273 1726853291.78113: checking for any_errors_fatal 13273 1726853291.78116: done checking for any_errors_fatal 13273 1726853291.78117: checking for max_fail_percentage 13273 1726853291.78118: done checking for 
max_fail_percentage 13273 1726853291.78118: checking to see if all hosts have failed and the running result is not ok 13273 1726853291.78119: done checking to see if all hosts have failed 13273 1726853291.78120: getting the remaining hosts for this loop 13273 1726853291.78121: done getting the remaining hosts for this loop 13273 1726853291.78123: getting the next task for host managed_node3 13273 1726853291.78126: done getting next task for host managed_node3 13273 1726853291.78128: ^ task is: TASK: Get stat for interface {{ interface }} 13273 1726853291.78130: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853291.78132: getting variables 13273 1726853291.78133: in VariableManager get_vars() 13273 1726853291.78149: Calling all_inventory to load vars for managed_node3 13273 1726853291.78151: Calling groups_inventory to load vars for managed_node3 13273 1726853291.78152: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853291.78157: Calling all_plugins_play to load vars for managed_node3 13273 1726853291.78160: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853291.78166: Calling groups_plugins_play to load vars for managed_node3 13273 1726853291.78305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853291.78500: done with get_vars() 13273 1726853291.78508: done getting variables 13273 1726853291.78648: variable 'interface' from source: task vars 13273 1726853291.78651: variable 'dhcp_interface2' from source: play vars 13273 1726853291.78710: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:28:11 -0400 (0:00:00.032) 0:00:09.676 ****** 13273 1726853291.78739: entering _queue_task() for managed_node3/stat 13273 1726853291.79204: worker is 1 (out of 1 available) 13273 1726853291.79213: exiting _queue_task() for managed_node3/stat 13273 1726853291.79223: done queuing things up, now waiting for results queue to drain 13273 1726853291.79224: waiting for pending results... 
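The worker queued here is about to run the `stat` task from `get_interface_stat.yml:3` for `test2` and register `interface_stat`. Based on the task name, the path, and the stat fields printed earlier for `test1`, the included file is plausibly shaped like the following sketch; the exact module options in the real file may differ:

```yaml
# Hypothetical reconstruction of get_interface_stat.yml, inferred from the
# log output above -- not copied from the actual collection file.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}   # symlink into /sys/devices/...
  register: interface_stat                 # consumed by the assert task as
                                           # interface_stat.stat.exists
```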
13273 1726853291.79302: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 13273 1726853291.79435: in run() - task 02083763-bbaf-5fc3-657d-000000000260 13273 1726853291.79463: variable 'ansible_search_path' from source: unknown 13273 1726853291.79474: variable 'ansible_search_path' from source: unknown 13273 1726853291.79512: calling self._execute() 13273 1726853291.79604: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.79616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.79630: variable 'omit' from source: magic vars 13273 1726853291.79992: variable 'ansible_distribution_major_version' from source: facts 13273 1726853291.80014: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853291.80025: variable 'omit' from source: magic vars 13273 1726853291.80082: variable 'omit' from source: magic vars 13273 1726853291.80214: variable 'interface' from source: task vars 13273 1726853291.80217: variable 'dhcp_interface2' from source: play vars 13273 1726853291.80242: variable 'dhcp_interface2' from source: play vars 13273 1726853291.80263: variable 'omit' from source: magic vars 13273 1726853291.80307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853291.80353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853291.80378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853291.80399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853291.80431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853291.80455: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13273 1726853291.80464: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.80541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.80573: Set connection var ansible_connection to ssh 13273 1726853291.80588: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853291.80598: Set connection var ansible_shell_executable to /bin/sh 13273 1726853291.80604: Set connection var ansible_shell_type to sh 13273 1726853291.80612: Set connection var ansible_pipelining to False 13273 1726853291.80622: Set connection var ansible_timeout to 10 13273 1726853291.80724: variable 'ansible_shell_executable' from source: unknown 13273 1726853291.80734: variable 'ansible_connection' from source: unknown 13273 1726853291.80741: variable 'ansible_module_compression' from source: unknown 13273 1726853291.80748: variable 'ansible_shell_type' from source: unknown 13273 1726853291.80764: variable 'ansible_shell_executable' from source: unknown 13273 1726853291.80775: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853291.80783: variable 'ansible_pipelining' from source: unknown 13273 1726853291.80791: variable 'ansible_timeout' from source: unknown 13273 1726853291.80868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853291.81001: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853291.81014: variable 'omit' from source: magic vars 13273 1726853291.81025: starting attempt loop 13273 1726853291.81030: running the handler 13273 1726853291.81046: _low_level_execute_command(): starting 13273 1726853291.81058: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 
1726853291.81830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.81889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.81894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.81970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.81997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.82023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.82043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.82152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.83882: stdout chunk (state=3): >>>/root <<< 13273 1726853291.84045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.84049: stdout chunk (state=3): >>><<< 13273 1726853291.84051: stderr chunk (state=3): >>><<< 13273 1726853291.84077: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853291.84180: _low_level_execute_command(): starting 13273 1726853291.84183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612 `" && echo ansible-tmp-1726853291.8408568-13832-38701325752612="` echo /root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612 `" ) && sleep 0' 13273 1726853291.84821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.84878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.84881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853291.84891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853291.84954: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.84973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.85059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.87062: stdout chunk (state=3): >>>ansible-tmp-1726853291.8408568-13832-38701325752612=/root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612 <<< 13273 1726853291.87251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.87254: stdout chunk (state=3): >>><<< 13273 1726853291.87256: stderr chunk (state=3): >>><<< 13273 1726853291.87480: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853291.8408568-13832-38701325752612=/root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853291.87483: variable 'ansible_module_compression' from source: unknown 13273 1726853291.87486: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13273 1726853291.87489: variable 'ansible_facts' from source: unknown 13273 1726853291.87534: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/AnsiballZ_stat.py 13273 1726853291.87719: Sending initial data 13273 1726853291.87728: Sent initial data (152 bytes) 13273 1726853291.88388: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.88403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.88481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.88535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.88552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853291.88586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.88691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.90462: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853291.90539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853291.90602: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp3sb56zkp /root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/AnsiballZ_stat.py <<< 13273 1726853291.90613: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/AnsiballZ_stat.py" <<< 13273 1726853291.90666: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp3sb56zkp" to remote "/root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/AnsiballZ_stat.py" <<< 13273 1726853291.91657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.91662: stderr chunk (state=3): >>><<< 13273 1726853291.91665: stdout chunk (state=3): >>><<< 13273 1726853291.91864: done transferring module to remote 13273 1726853291.91867: _low_level_execute_command(): starting 13273 1726853291.91870: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/ /root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/AnsiballZ_stat.py && sleep 0' 13273 1726853291.92856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.92977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853291.92980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.93104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853291.94992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853291.95027: stderr chunk (state=3): >>><<< 13273 1726853291.95054: stdout chunk (state=3): >>><<< 13273 1726853291.95078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853291.95088: _low_level_execute_command(): starting 13273 1726853291.95098: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/AnsiballZ_stat.py && sleep 0' 13273 1726853291.95724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853291.95736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853291.95748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853291.95784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853291.95794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853291.95803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853291.95811: stderr chunk (state=3): >>>debug2: match found <<< 13273 1726853291.95834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853291.95904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 
13273 1726853291.95924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853291.96280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853292.11820: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29120, "dev": 23, "nlink": 1, "atime": 1726853290.022301, "mtime": 1726853290.022301, "ctime": 1726853290.022301, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13273 1726853292.13209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853292.13214: stdout chunk (state=3): >>><<< 13273 1726853292.13217: stderr chunk (state=3): >>><<< 13273 1726853292.13492: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29120, "dev": 23, "nlink": 1, "atime": 1726853290.022301, "mtime": 1726853290.022301, "ctime": 1726853290.022301, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853292.13496: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853292.13502: _low_level_execute_command(): starting 13273 1726853292.13505: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853291.8408568-13832-38701325752612/ > /dev/null 2>&1 && sleep 0' 13273 1726853292.15096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853292.15416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853292.15522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853292.17526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853292.17530: stdout chunk (state=3): >>><<< 13273 1726853292.17532: stderr chunk (state=3): >>><<< 13273 1726853292.17535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853292.17537: handler run complete 13273 1726853292.17853: attempt loop complete, returning result 13273 1726853292.17857: _execute() done 13273 1726853292.17859: dumping result to json 13273 1726853292.17862: done dumping result, returning 13273 1726853292.17864: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [02083763-bbaf-5fc3-657d-000000000260] 13273 1726853292.17866: sending task result for task 02083763-bbaf-5fc3-657d-000000000260 13273 1726853292.18147: done sending task result for task 02083763-bbaf-5fc3-657d-000000000260 13273 1726853292.18151: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853290.022301, "block_size": 4096, "blocks": 0, "ctime": 1726853290.022301, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29120, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726853290.022301, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13273 1726853292.18275: no more pending results, returning what we have 13273 1726853292.18278: results queue empty 13273 1726853292.18280: checking for any_errors_fatal 13273 1726853292.18281: done checking for any_errors_fatal 13273 1726853292.18282: checking for max_fail_percentage 13273 1726853292.18283: done checking for max_fail_percentage 13273 
1726853292.18284: checking to see if all hosts have failed and the running result is not ok 13273 1726853292.18285: done checking to see if all hosts have failed 13273 1726853292.18285: getting the remaining hosts for this loop 13273 1726853292.18287: done getting the remaining hosts for this loop 13273 1726853292.18290: getting the next task for host managed_node3 13273 1726853292.18297: done getting next task for host managed_node3 13273 1726853292.18300: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13273 1726853292.18303: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853292.18307: getting variables 13273 1726853292.18308: in VariableManager get_vars() 13273 1726853292.18362: Calling all_inventory to load vars for managed_node3 13273 1726853292.18365: Calling groups_inventory to load vars for managed_node3 13273 1726853292.18368: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.18782: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.18785: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.18789: Calling groups_plugins_play to load vars for managed_node3 13273 1726853292.19617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.20230: done with get_vars() 13273 1726853292.20244: done getting variables 13273 1726853292.20306: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853292.20829: variable 'interface' from source: task vars 13273 1726853292.20833: variable 'dhcp_interface2' from source: play vars 13273 1726853292.20898: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:28:12 -0400 (0:00:00.421) 0:00:10.098 ****** 13273 1726853292.20929: entering _queue_task() for managed_node3/assert 13273 1726853292.22057: worker is 1 (out of 1 available) 13273 1726853292.22070: exiting _queue_task() for managed_node3/assert 13273 1726853292.22084: done queuing things up, now waiting for results queue to drain 13273 1726853292.22085: waiting for pending results... 
13273 1726853292.22791: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 13273 1726853292.22800: in run() - task 02083763-bbaf-5fc3-657d-00000000001c 13273 1726853292.22805: variable 'ansible_search_path' from source: unknown 13273 1726853292.22807: variable 'ansible_search_path' from source: unknown 13273 1726853292.23177: calling self._execute() 13273 1726853292.23181: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.23183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.23185: variable 'omit' from source: magic vars 13273 1726853292.23695: variable 'ansible_distribution_major_version' from source: facts 13273 1726853292.23711: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853292.23722: variable 'omit' from source: magic vars 13273 1726853292.23773: variable 'omit' from source: magic vars 13273 1726853292.23874: variable 'interface' from source: task vars 13273 1726853292.23889: variable 'dhcp_interface2' from source: play vars 13273 1726853292.23953: variable 'dhcp_interface2' from source: play vars 13273 1726853292.23981: variable 'omit' from source: magic vars 13273 1726853292.24028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853292.24067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853292.24095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853292.24120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853292.24184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853292.24275: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 13273 1726853292.24285: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.24477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.24481: Set connection var ansible_connection to ssh 13273 1726853292.24547: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853292.24560: Set connection var ansible_shell_executable to /bin/sh 13273 1726853292.24567: Set connection var ansible_shell_type to sh 13273 1726853292.24580: Set connection var ansible_pipelining to False 13273 1726853292.24677: Set connection var ansible_timeout to 10 13273 1726853292.24694: variable 'ansible_shell_executable' from source: unknown 13273 1726853292.24702: variable 'ansible_connection' from source: unknown 13273 1726853292.24709: variable 'ansible_module_compression' from source: unknown 13273 1726853292.24716: variable 'ansible_shell_type' from source: unknown 13273 1726853292.24722: variable 'ansible_shell_executable' from source: unknown 13273 1726853292.24730: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.24737: variable 'ansible_pipelining' from source: unknown 13273 1726853292.24764: variable 'ansible_timeout' from source: unknown 13273 1726853292.24777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.25180: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853292.25184: variable 'omit' from source: magic vars 13273 1726853292.25186: starting attempt loop 13273 1726853292.25190: running the handler 13273 1726853292.25414: variable 'interface_stat' from source: set_fact 13273 1726853292.25523: Evaluated conditional 
(interface_stat.stat.exists): True 13273 1726853292.25527: handler run complete 13273 1726853292.25529: attempt loop complete, returning result 13273 1726853292.25531: _execute() done 13273 1726853292.25533: dumping result to json 13273 1726853292.25536: done dumping result, returning 13273 1726853292.25538: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [02083763-bbaf-5fc3-657d-00000000001c] 13273 1726853292.25677: sending task result for task 02083763-bbaf-5fc3-657d-00000000001c 13273 1726853292.25921: done sending task result for task 02083763-bbaf-5fc3-657d-00000000001c 13273 1726853292.25924: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853292.25978: no more pending results, returning what we have 13273 1726853292.25982: results queue empty 13273 1726853292.25983: checking for any_errors_fatal 13273 1726853292.25989: done checking for any_errors_fatal 13273 1726853292.25990: checking for max_fail_percentage 13273 1726853292.25991: done checking for max_fail_percentage 13273 1726853292.25992: checking to see if all hosts have failed and the running result is not ok 13273 1726853292.25993: done checking to see if all hosts have failed 13273 1726853292.25994: getting the remaining hosts for this loop 13273 1726853292.25995: done getting the remaining hosts for this loop 13273 1726853292.25998: getting the next task for host managed_node3 13273 1726853292.26005: done getting next task for host managed_node3 13273 1726853292.26007: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 13273 1726853292.26010: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853292.26013: getting variables 13273 1726853292.26015: in VariableManager get_vars() 13273 1726853292.26075: Calling all_inventory to load vars for managed_node3 13273 1726853292.26097: Calling groups_inventory to load vars for managed_node3 13273 1726853292.26101: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.26112: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.26115: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.26118: Calling groups_plugins_play to load vars for managed_node3 13273 1726853292.26475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.26673: done with get_vars() 13273 1726853292.26684: done getting variables 13273 1726853292.26738: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Friday 20 September 2024 13:28:12 -0400 (0:00:00.058) 0:00:10.156 ****** 13273 1726853292.26763: entering _queue_task() for managed_node3/command 13273 1726853292.27022: worker is 1 (out of 1 available) 13273 1726853292.27033: exiting _queue_task() for managed_node3/command 13273 1726853292.27046: done queuing things up, now waiting for results queue to drain 13273 1726853292.27047: waiting for pending results... 
13273 1726853292.27334: running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript 13273 1726853292.27478: in run() - task 02083763-bbaf-5fc3-657d-00000000001d 13273 1726853292.27482: variable 'ansible_search_path' from source: unknown 13273 1726853292.27877: calling self._execute() 13273 1726853292.27880: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.27883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.27885: variable 'omit' from source: magic vars 13273 1726853292.28587: variable 'ansible_distribution_major_version' from source: facts 13273 1726853292.28602: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853292.28724: variable 'network_provider' from source: set_fact 13273 1726853292.28978: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853292.28982: when evaluation is False, skipping this task 13273 1726853292.28985: _execute() done 13273 1726853292.28987: dumping result to json 13273 1726853292.28989: done dumping result, returning 13273 1726853292.28992: done running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript [02083763-bbaf-5fc3-657d-00000000001d] 13273 1726853292.28994: sending task result for task 02083763-bbaf-5fc3-657d-00000000001d 13273 1726853292.29061: done sending task result for task 02083763-bbaf-5fc3-657d-00000000001d 13273 1726853292.29064: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853292.29116: no more pending results, returning what we have 13273 1726853292.29119: results queue empty 13273 1726853292.29120: checking for any_errors_fatal 13273 1726853292.29125: done checking for any_errors_fatal 13273 1726853292.29126: checking for max_fail_percentage 13273 1726853292.29128: done checking 
for max_fail_percentage 13273 1726853292.29128: checking to see if all hosts have failed and the running result is not ok 13273 1726853292.29129: done checking to see if all hosts have failed 13273 1726853292.29130: getting the remaining hosts for this loop 13273 1726853292.29131: done getting the remaining hosts for this loop 13273 1726853292.29134: getting the next task for host managed_node3 13273 1726853292.29139: done getting next task for host managed_node3 13273 1726853292.29142: ^ task is: TASK: TEST Add Bond with 2 ports 13273 1726853292.29145: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853292.29148: getting variables 13273 1726853292.29150: in VariableManager get_vars() 13273 1726853292.29207: Calling all_inventory to load vars for managed_node3 13273 1726853292.29210: Calling groups_inventory to load vars for managed_node3 13273 1726853292.30129: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.30141: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.30144: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.30148: Calling groups_plugins_play to load vars for managed_node3 13273 1726853292.30306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.30495: done with get_vars() 13273 1726853292.30506: done getting variables 13273 1726853292.30565: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Friday 20 September 2024 13:28:12 -0400 (0:00:00.038) 0:00:10.195 ****** 13273 1726853292.30592: entering _queue_task() for managed_node3/debug 13273 1726853292.30854: worker is 1 (out of 1 available) 13273 1726853292.30866: exiting _queue_task() for managed_node3/debug 13273 1726853292.31079: done queuing things up, now waiting for results queue to drain 13273 1726853292.31081: waiting for pending results... 13273 1726853292.31210: running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports 13273 1726853292.31241: in run() - task 02083763-bbaf-5fc3-657d-00000000001e 13273 1726853292.31264: variable 'ansible_search_path' from source: unknown 13273 1726853292.31311: calling self._execute() 13273 1726853292.31400: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.31416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.31431: variable 'omit' from source: magic vars 13273 1726853292.31804: variable 'ansible_distribution_major_version' from source: facts 13273 1726853292.31823: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853292.31847: variable 'omit' from source: magic vars 13273 1726853292.31860: variable 'omit' from source: magic vars 13273 1726853292.31901: variable 'omit' from source: magic vars 13273 1726853292.31957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853292.31989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853292.32065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853292.32069: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853292.32074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853292.32085: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853292.32094: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.32102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.32205: Set connection var ansible_connection to ssh 13273 1726853292.32220: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853292.32229: Set connection var ansible_shell_executable to /bin/sh 13273 1726853292.32235: Set connection var ansible_shell_type to sh 13273 1726853292.32244: Set connection var ansible_pipelining to False 13273 1726853292.32275: Set connection var ansible_timeout to 10 13273 1726853292.32289: variable 'ansible_shell_executable' from source: unknown 13273 1726853292.32296: variable 'ansible_connection' from source: unknown 13273 1726853292.32302: variable 'ansible_module_compression' from source: unknown 13273 1726853292.32310: variable 'ansible_shell_type' from source: unknown 13273 1726853292.32577: variable 'ansible_shell_executable' from source: unknown 13273 1726853292.32581: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.32583: variable 'ansible_pipelining' from source: unknown 13273 1726853292.32585: variable 'ansible_timeout' from source: unknown 13273 1726853292.32588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.32590: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853292.32593: variable 'omit' from source: magic vars 13273 1726853292.32595: starting attempt loop 13273 1726853292.32598: running the handler 13273 1726853292.32599: handler run complete 13273 1726853292.32601: attempt loop complete, returning result 13273 1726853292.32603: _execute() done 13273 1726853292.32605: dumping result to json 13273 1726853292.32608: done dumping result, returning 13273 1726853292.32611: done running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports [02083763-bbaf-5fc3-657d-00000000001e] 13273 1726853292.32614: sending task result for task 02083763-bbaf-5fc3-657d-00000000001e 13273 1726853292.32682: done sending task result for task 02083763-bbaf-5fc3-657d-00000000001e ok: [managed_node3] => {} MSG: ################################################## 13273 1726853292.32732: no more pending results, returning what we have 13273 1726853292.32735: results queue empty 13273 1726853292.32736: checking for any_errors_fatal 13273 1726853292.32743: done checking for any_errors_fatal 13273 1726853292.32744: checking for max_fail_percentage 13273 1726853292.32745: done checking for max_fail_percentage 13273 1726853292.32746: checking to see if all hosts have failed and the running result is not ok 13273 1726853292.32747: done checking to see if all hosts have failed 13273 1726853292.32747: getting the remaining hosts for this loop 13273 1726853292.32749: done getting the remaining hosts for this loop 13273 1726853292.32752: getting the next task for host managed_node3 13273 1726853292.32760: done getting next task for host managed_node3 13273 1726853292.32766: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13273 1726853292.32769: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853292.32788: getting variables 13273 1726853292.32789: in VariableManager get_vars() 13273 1726853292.32843: Calling all_inventory to load vars for managed_node3 13273 1726853292.32847: Calling groups_inventory to load vars for managed_node3 13273 1726853292.32849: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.32859: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.32862: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.32865: Calling groups_plugins_play to load vars for managed_node3 13273 1726853292.33210: WORKER PROCESS EXITING 13273 1726853292.33230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.33456: done with get_vars() 13273 1726853292.33466: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:28:12 -0400 (0:00:00.029) 0:00:10.224 ****** 13273 1726853292.33545: entering _queue_task() for managed_node3/include_tasks 13273 1726853292.33777: worker is 1 (out of 1 available) 13273 1726853292.33789: exiting _queue_task() for managed_node3/include_tasks 13273 1726853292.33801: done queuing things up, now waiting for results queue to drain 13273 1726853292.33802: waiting for pending results... 
13273 1726853292.34328: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13273 1726853292.34707: in run() - task 02083763-bbaf-5fc3-657d-000000000026 13273 1726853292.34815: variable 'ansible_search_path' from source: unknown 13273 1726853292.34819: variable 'ansible_search_path' from source: unknown 13273 1726853292.34822: calling self._execute() 13273 1726853292.34961: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.34977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.34992: variable 'omit' from source: magic vars 13273 1726853292.35810: variable 'ansible_distribution_major_version' from source: facts 13273 1726853292.35828: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853292.35840: _execute() done 13273 1726853292.35848: dumping result to json 13273 1726853292.35856: done dumping result, returning 13273 1726853292.35869: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-5fc3-657d-000000000026] 13273 1726853292.35882: sending task result for task 02083763-bbaf-5fc3-657d-000000000026 13273 1726853292.36252: done sending task result for task 02083763-bbaf-5fc3-657d-000000000026 13273 1726853292.36255: WORKER PROCESS EXITING 13273 1726853292.36416: no more pending results, returning what we have 13273 1726853292.36420: in VariableManager get_vars() 13273 1726853292.36486: Calling all_inventory to load vars for managed_node3 13273 1726853292.36489: Calling groups_inventory to load vars for managed_node3 13273 1726853292.36492: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.36505: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.36508: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.36511: Calling 
groups_plugins_play to load vars for managed_node3 13273 1726853292.37206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.37617: done with get_vars() 13273 1726853292.37626: variable 'ansible_search_path' from source: unknown 13273 1726853292.37627: variable 'ansible_search_path' from source: unknown 13273 1726853292.37667: we have included files to process 13273 1726853292.37668: generating all_blocks data 13273 1726853292.37670: done generating all_blocks data 13273 1726853292.38077: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853292.38079: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853292.38083: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853292.40022: done processing included file 13273 1726853292.40024: iterating over new_blocks loaded from include file 13273 1726853292.40025: in VariableManager get_vars() 13273 1726853292.40059: done with get_vars() 13273 1726853292.40061: filtering new block on tags 13273 1726853292.40286: done filtering new block on tags 13273 1726853292.40290: in VariableManager get_vars() 13273 1726853292.40320: done with get_vars() 13273 1726853292.40322: filtering new block on tags 13273 1726853292.40342: done filtering new block on tags 13273 1726853292.40344: in VariableManager get_vars() 13273 1726853292.40375: done with get_vars() 13273 1726853292.40377: filtering new block on tags 13273 1726853292.40395: done filtering new block on tags 13273 1726853292.40397: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13273 1726853292.40402: extending task lists for 
all hosts with included blocks 13273 1726853292.41756: done extending task lists 13273 1726853292.41758: done processing included files 13273 1726853292.41758: results queue empty 13273 1726853292.41759: checking for any_errors_fatal 13273 1726853292.41762: done checking for any_errors_fatal 13273 1726853292.41763: checking for max_fail_percentage 13273 1726853292.41764: done checking for max_fail_percentage 13273 1726853292.41764: checking to see if all hosts have failed and the running result is not ok 13273 1726853292.41765: done checking to see if all hosts have failed 13273 1726853292.41766: getting the remaining hosts for this loop 13273 1726853292.41767: done getting the remaining hosts for this loop 13273 1726853292.41769: getting the next task for host managed_node3 13273 1726853292.41775: done getting next task for host managed_node3 13273 1726853292.41778: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13273 1726853292.41781: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853292.41790: getting variables 13273 1726853292.41790: in VariableManager get_vars() 13273 1726853292.41809: Calling all_inventory to load vars for managed_node3 13273 1726853292.41812: Calling groups_inventory to load vars for managed_node3 13273 1726853292.41813: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.41818: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.41821: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.41824: Calling groups_plugins_play to load vars for managed_node3 13273 1726853292.41984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.42186: done with get_vars() 13273 1726853292.42194: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:28:12 -0400 (0:00:00.087) 0:00:10.311 ****** 13273 1726853292.42260: entering _queue_task() for managed_node3/setup 13273 1726853292.42562: worker is 1 (out of 1 available) 13273 1726853292.42576: exiting _queue_task() for managed_node3/setup 13273 1726853292.42588: done queuing things up, now waiting for results queue to drain 13273 1726853292.42589: waiting for pending results... 
13273 1726853292.42848: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13273 1726853292.42996: in run() - task 02083763-bbaf-5fc3-657d-00000000027e 13273 1726853292.43013: variable 'ansible_search_path' from source: unknown 13273 1726853292.43020: variable 'ansible_search_path' from source: unknown 13273 1726853292.43056: calling self._execute() 13273 1726853292.43139: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.43150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.43162: variable 'omit' from source: magic vars 13273 1726853292.43510: variable 'ansible_distribution_major_version' from source: facts 13273 1726853292.43527: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853292.43739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853292.45904: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853292.45909: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853292.45924: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853292.45964: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853292.45995: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853292.46378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853292.46382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853292.46385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853292.46389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853292.46409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853292.46465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853292.46621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853292.46650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853292.46695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853292.46812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853292.47068: variable '__network_required_facts' from source: role 
'' defaults 13273 1726853292.47177: variable 'ansible_facts' from source: unknown 13273 1726853292.47465: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13273 1726853292.47469: when evaluation is False, skipping this task 13273 1726853292.47473: _execute() done 13273 1726853292.47475: dumping result to json 13273 1726853292.47478: done dumping result, returning 13273 1726853292.47480: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-5fc3-657d-00000000027e] 13273 1726853292.47483: sending task result for task 02083763-bbaf-5fc3-657d-00000000027e 13273 1726853292.47549: done sending task result for task 02083763-bbaf-5fc3-657d-00000000027e 13273 1726853292.47552: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853292.47613: no more pending results, returning what we have 13273 1726853292.47616: results queue empty 13273 1726853292.47617: checking for any_errors_fatal 13273 1726853292.47619: done checking for any_errors_fatal 13273 1726853292.47620: checking for max_fail_percentage 13273 1726853292.47622: done checking for max_fail_percentage 13273 1726853292.47623: checking to see if all hosts have failed and the running result is not ok 13273 1726853292.47623: done checking to see if all hosts have failed 13273 1726853292.47624: getting the remaining hosts for this loop 13273 1726853292.47625: done getting the remaining hosts for this loop 13273 1726853292.47629: getting the next task for host managed_node3 13273 1726853292.47637: done getting next task for host managed_node3 13273 1726853292.47641: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13273 1726853292.47645: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853292.47660: getting variables 13273 1726853292.47661: in VariableManager get_vars() 13273 1726853292.47718: Calling all_inventory to load vars for managed_node3 13273 1726853292.47721: Calling groups_inventory to load vars for managed_node3 13273 1726853292.47724: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.47734: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.47737: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.47740: Calling groups_plugins_play to load vars for managed_node3 13273 1726853292.48423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.48940: done with get_vars() 13273 1726853292.48951: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:28:12 -0400 (0:00:00.067) 0:00:10.379 ****** 13273 1726853292.49066: entering _queue_task() for managed_node3/stat 13273 1726853292.49716: worker is 1 (out of 1 
available) 13273 1726853292.49727: exiting _queue_task() for managed_node3/stat 13273 1726853292.49738: done queuing things up, now waiting for results queue to drain 13273 1726853292.49739: waiting for pending results... 13273 1726853292.50144: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13273 1726853292.50849: in run() - task 02083763-bbaf-5fc3-657d-000000000280 13273 1726853292.50853: variable 'ansible_search_path' from source: unknown 13273 1726853292.50856: variable 'ansible_search_path' from source: unknown 13273 1726853292.50858: calling self._execute() 13273 1726853292.50916: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.51376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.51380: variable 'omit' from source: magic vars 13273 1726853292.51986: variable 'ansible_distribution_major_version' from source: facts 13273 1726853292.52086: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853292.52254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853292.53280: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853292.53520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853292.53567: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853292.53708: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853292.53813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853292.54087: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853292.54119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853292.54172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853292.54553: variable '__network_is_ostree' from source: set_fact 13273 1726853292.54776: Evaluated conditional (not __network_is_ostree is defined): False 13273 1726853292.54779: when evaluation is False, skipping this task 13273 1726853292.54782: _execute() done 13273 1726853292.54785: dumping result to json 13273 1726853292.54787: done dumping result, returning 13273 1726853292.54790: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-5fc3-657d-000000000280] 13273 1726853292.54792: sending task result for task 02083763-bbaf-5fc3-657d-000000000280 13273 1726853292.54867: done sending task result for task 02083763-bbaf-5fc3-657d-000000000280 13273 1726853292.54873: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13273 1726853292.54930: no more pending results, returning what we have 13273 1726853292.54934: results queue empty 13273 1726853292.54935: checking for any_errors_fatal 13273 1726853292.54942: done checking for any_errors_fatal 13273 1726853292.54943: checking for max_fail_percentage 13273 1726853292.54945: done checking for max_fail_percentage 13273 1726853292.54946: checking to see if all hosts have failed and the running result is not ok 13273 
1726853292.54947: done checking to see if all hosts have failed 13273 1726853292.54947: getting the remaining hosts for this loop 13273 1726853292.54949: done getting the remaining hosts for this loop 13273 1726853292.54953: getting the next task for host managed_node3 13273 1726853292.54962: done getting next task for host managed_node3 13273 1726853292.54966: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13273 1726853292.54970: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853292.54985: getting variables 13273 1726853292.54987: in VariableManager get_vars() 13273 1726853292.55046: Calling all_inventory to load vars for managed_node3 13273 1726853292.55049: Calling groups_inventory to load vars for managed_node3 13273 1726853292.55052: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.55062: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.55065: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.55069: Calling groups_plugins_play to load vars for managed_node3 13273 1726853292.55618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.56096: done with get_vars() 13273 1726853292.56107: done getting variables 13273 1726853292.56214: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:28:12 -0400 (0:00:00.073) 0:00:10.453 ****** 13273 1726853292.56416: entering _queue_task() for managed_node3/set_fact 13273 1726853292.57234: worker is 1 (out of 1 available) 13273 1726853292.57246: exiting _queue_task() for managed_node3/set_fact 13273 1726853292.57258: done queuing things up, now waiting for results queue to drain 13273 1726853292.57259: waiting for pending results... 
13273 1726853292.58034: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13273 1726853292.58297: in run() - task 02083763-bbaf-5fc3-657d-000000000281 13273 1726853292.58321: variable 'ansible_search_path' from source: unknown 13273 1726853292.58395: variable 'ansible_search_path' from source: unknown 13273 1726853292.58429: calling self._execute() 13273 1726853292.58612: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.58721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.58725: variable 'omit' from source: magic vars 13273 1726853292.59409: variable 'ansible_distribution_major_version' from source: facts 13273 1726853292.59517: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853292.59764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853292.60338: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853292.60428: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853292.60528: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853292.60776: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853292.60779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853292.60829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853292.60976: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853292.60988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853292.61122: variable '__network_is_ostree' from source: set_fact 13273 1726853292.61479: Evaluated conditional (not __network_is_ostree is defined): False 13273 1726853292.61483: when evaluation is False, skipping this task 13273 1726853292.61485: _execute() done 13273 1726853292.61488: dumping result to json 13273 1726853292.61490: done dumping result, returning 13273 1726853292.61493: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-5fc3-657d-000000000281] 13273 1726853292.61495: sending task result for task 02083763-bbaf-5fc3-657d-000000000281 13273 1726853292.61565: done sending task result for task 02083763-bbaf-5fc3-657d-000000000281 13273 1726853292.61568: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13273 1726853292.61631: no more pending results, returning what we have 13273 1726853292.61635: results queue empty 13273 1726853292.61636: checking for any_errors_fatal 13273 1726853292.61644: done checking for any_errors_fatal 13273 1726853292.61645: checking for max_fail_percentage 13273 1726853292.61647: done checking for max_fail_percentage 13273 1726853292.61648: checking to see if all hosts have failed and the running result is not ok 13273 1726853292.61649: done checking to see if all hosts have failed 13273 1726853292.61649: getting the remaining hosts for this loop 13273 1726853292.61651: done getting the remaining hosts for this loop 
13273 1726853292.61656: getting the next task for host managed_node3 13273 1726853292.61666: done getting next task for host managed_node3 13273 1726853292.61672: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13273 1726853292.61678: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853292.61694: getting variables 13273 1726853292.61696: in VariableManager get_vars() 13273 1726853292.61753: Calling all_inventory to load vars for managed_node3 13273 1726853292.61756: Calling groups_inventory to load vars for managed_node3 13273 1726853292.61759: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853292.61769: Calling all_plugins_play to load vars for managed_node3 13273 1726853292.61977: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853292.61981: Calling groups_plugins_play to load vars for managed_node3 13273 1726853292.62160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853292.62824: done with get_vars() 13273 1726853292.62836: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:28:12 -0400 (0:00:00.065) 0:00:10.518 ****** 13273 1726853292.62931: entering _queue_task() for managed_node3/service_facts 13273 1726853292.62933: Creating lock for service_facts 13273 1726853292.63611: worker is 1 (out of 1 available) 13273 1726853292.63621: exiting _queue_task() for managed_node3/service_facts 13273 1726853292.63633: done queuing things up, now waiting for results queue to drain 13273 1726853292.63635: waiting for pending results... 
13273 1726853292.63964: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13273 1726853292.64286: in run() - task 02083763-bbaf-5fc3-657d-000000000283 13273 1726853292.64301: variable 'ansible_search_path' from source: unknown 13273 1726853292.64304: variable 'ansible_search_path' from source: unknown 13273 1726853292.64336: calling self._execute() 13273 1726853292.64421: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.64428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.64578: variable 'omit' from source: magic vars 13273 1726853292.65222: variable 'ansible_distribution_major_version' from source: facts 13273 1726853292.65390: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853292.65576: variable 'omit' from source: magic vars 13273 1726853292.65579: variable 'omit' from source: magic vars 13273 1726853292.65581: variable 'omit' from source: magic vars 13273 1726853292.65583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853292.65798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853292.65824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853292.65938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853292.66189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853292.66375: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853292.66379: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.66381: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13273 1726853292.66383: Set connection var ansible_connection to ssh 13273 1726853292.66385: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853292.66387: Set connection var ansible_shell_executable to /bin/sh 13273 1726853292.66389: Set connection var ansible_shell_type to sh 13273 1726853292.66391: Set connection var ansible_pipelining to False 13273 1726853292.66393: Set connection var ansible_timeout to 10 13273 1726853292.66395: variable 'ansible_shell_executable' from source: unknown 13273 1726853292.66397: variable 'ansible_connection' from source: unknown 13273 1726853292.66400: variable 'ansible_module_compression' from source: unknown 13273 1726853292.66776: variable 'ansible_shell_type' from source: unknown 13273 1726853292.66779: variable 'ansible_shell_executable' from source: unknown 13273 1726853292.66781: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853292.66783: variable 'ansible_pipelining' from source: unknown 13273 1726853292.66785: variable 'ansible_timeout' from source: unknown 13273 1726853292.66787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853292.67141: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853292.67149: variable 'omit' from source: magic vars 13273 1726853292.67151: starting attempt loop 13273 1726853292.67153: running the handler 13273 1726853292.67155: _low_level_execute_command(): starting 13273 1726853292.67157: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853292.68153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13273 1726853292.68387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853292.68481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853292.68785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853292.70262: stdout chunk (state=3): >>>/root <<< 13273 1726853292.70404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853292.70407: stderr chunk (state=3): >>><<< 13273 1726853292.70410: stdout chunk (state=3): >>><<< 13273 1726853292.70521: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853292.70526: _low_level_execute_command(): starting 13273 1726853292.70533: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315 `" && echo ansible-tmp-1726853292.704289-13896-137195912565315="` echo /root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315 `" ) && sleep 0' 13273 1726853292.71582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853292.71585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853292.71588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853292.71597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853292.71599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853292.71784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853292.71966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853292.73874: stdout chunk (state=3): >>>ansible-tmp-1726853292.704289-13896-137195912565315=/root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315 <<< 13273 1726853292.73999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853292.74036: stderr chunk (state=3): >>><<< 13273 1726853292.74048: stdout chunk (state=3): >>><<< 13273 1726853292.74477: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853292.704289-13896-137195912565315=/root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853292.74480: variable 'ansible_module_compression' from source: unknown 13273 1726853292.74482: ANSIBALLZ: Using lock for service_facts 13273 1726853292.74484: ANSIBALLZ: Acquiring lock 13273 1726853292.74486: ANSIBALLZ: Lock acquired: 140136090705296 13273 1726853292.74488: ANSIBALLZ: Creating module 13273 1726853292.95526: ANSIBALLZ: Writing module into payload 13273 1726853292.95877: ANSIBALLZ: Writing module 13273 1726853292.95908: ANSIBALLZ: Renaming module 13273 1726853292.95920: ANSIBALLZ: Done creating module 13273 1726853292.95941: variable 'ansible_facts' from source: unknown 13273 1726853292.96027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/AnsiballZ_service_facts.py 13273 1726853292.96485: Sending initial data 13273 1726853292.96585: Sent initial data (161 bytes) 13273 1726853292.97761: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853292.97778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853292.97789: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853292.97990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853292.98003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853292.98057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853292.98067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853292.98463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853292.98542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853293.00328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853293.00394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853293.00445: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpe5oegi7r /root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/AnsiballZ_service_facts.py <<< 13273 1726853293.00454: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/AnsiballZ_service_facts.py" <<< 13273 1726853293.00536: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpe5oegi7r" to remote "/root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/AnsiballZ_service_facts.py" <<< 13273 1726853293.02033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853293.02091: stderr chunk (state=3): >>><<< 13273 1726853293.02100: stdout chunk (state=3): >>><<< 13273 1726853293.02125: done transferring module to remote 13273 1726853293.02139: _low_level_execute_command(): starting 13273 1726853293.02157: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/ /root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/AnsiballZ_service_facts.py && sleep 0' 13273 1726853293.03578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853293.03582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853293.03736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853293.03751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853293.03778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853293.03888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853293.05757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853293.05761: stdout chunk (state=3): >>><<< 13273 1726853293.05767: stderr chunk (state=3): >>><<< 13273 1726853293.05893: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853293.05897: _low_level_execute_command(): starting 13273 1726853293.05901: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/AnsiballZ_service_facts.py && sleep 0' 13273 1726853293.06896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853293.07085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853293.07191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853293.07202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853293.07223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 
1726853293.07317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853294.67398: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 13273 1726853294.67416: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped",
"status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service":
{"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status":
"static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13273 1726853294.69085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853294.69135: stderr chunk (state=3): >>><<< 13273 1726853294.69150: stdout chunk (state=3): >>><<< 13273 1726853294.69172: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853294.69629: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853294.69637: _low_level_execute_command(): starting 13273 1726853294.69643: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853292.704289-13896-137195912565315/ > /dev/null 2>&1 && sleep 0' 13273 1726853294.70337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853294.70340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853294.70342: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853294.70345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853294.70347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853294.70400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853294.70405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853294.70422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853294.70503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853294.72478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853294.72501: stdout chunk (state=3): >>><<< 13273 1726853294.72504: stderr chunk (state=3): >>><<< 13273 1726853294.72680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853294.72684: handler run complete 13273 1726853294.72735: variable 'ansible_facts' from source: unknown 13273 1726853294.74016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853294.74546: variable 'ansible_facts' from source: unknown 13273 1726853294.74678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853294.74880: attempt loop complete, returning result 13273 1726853294.74889: _execute() done 13273 1726853294.74896: dumping result to json 13273 1726853294.74977: done dumping result, returning 13273 1726853294.74991: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-5fc3-657d-000000000283] 13273 1726853294.75000: sending task result for task 02083763-bbaf-5fc3-657d-000000000283 13273 1726853294.76060: done sending task result for task 02083763-bbaf-5fc3-657d-000000000283 13273 1726853294.76063: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853294.76125: no more pending results, returning what we have 13273 1726853294.76128: results queue empty 13273 1726853294.76129: checking for any_errors_fatal 13273 1726853294.76133: done checking for any_errors_fatal 13273 1726853294.76134: checking for max_fail_percentage 13273 1726853294.76135: done checking for max_fail_percentage 13273 1726853294.76136: checking to see if all hosts have failed and the running result is not ok 13273 1726853294.76136: done checking to see if all hosts have failed 13273 1726853294.76137: getting the remaining hosts for this loop 13273 1726853294.76138: done getting the remaining hosts for this loop 13273 1726853294.76141: getting 
the next task for host managed_node3 13273 1726853294.76146: done getting next task for host managed_node3 13273 1726853294.76150: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13273 1726853294.76153: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853294.76162: getting variables 13273 1726853294.76164: in VariableManager get_vars() 13273 1726853294.76322: Calling all_inventory to load vars for managed_node3 13273 1726853294.76325: Calling groups_inventory to load vars for managed_node3 13273 1726853294.76327: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853294.76336: Calling all_plugins_play to load vars for managed_node3 13273 1726853294.76338: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853294.76341: Calling groups_plugins_play to load vars for managed_node3 13273 1726853294.76778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853294.77313: done with get_vars() 13273 1726853294.77326: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:28:14 -0400 (0:00:02.145) 0:00:12.663 ****** 13273 1726853294.77441: entering _queue_task() for managed_node3/package_facts 13273 1726853294.77443: Creating lock for package_facts 13273 1726853294.77839: worker is 1 (out of 1 available) 13273 1726853294.77848: exiting _queue_task() for managed_node3/package_facts 13273 1726853294.77861: done queuing things up, now waiting for results queue to drain 13273 1726853294.77862: waiting for pending results... 
13273 1726853294.78243: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13273 1726853294.78282: in run() - task 02083763-bbaf-5fc3-657d-000000000284 13273 1726853294.78303: variable 'ansible_search_path' from source: unknown 13273 1726853294.78311: variable 'ansible_search_path' from source: unknown 13273 1726853294.78356: calling self._execute() 13273 1726853294.78458: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853294.78473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853294.78499: variable 'omit' from source: magic vars 13273 1726853294.78925: variable 'ansible_distribution_major_version' from source: facts 13273 1726853294.78932: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853294.78935: variable 'omit' from source: magic vars 13273 1726853294.79008: variable 'omit' from source: magic vars 13273 1726853294.79062: variable 'omit' from source: magic vars 13273 1726853294.79141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853294.79170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853294.79197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853294.79250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853294.79259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853294.79290: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853294.79319: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853294.79322: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13273 1726853294.79428: Set connection var ansible_connection to ssh 13273 1726853294.79445: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853294.79467: Set connection var ansible_shell_executable to /bin/sh 13273 1726853294.79536: Set connection var ansible_shell_type to sh 13273 1726853294.79539: Set connection var ansible_pipelining to False 13273 1726853294.79541: Set connection var ansible_timeout to 10 13273 1726853294.79544: variable 'ansible_shell_executable' from source: unknown 13273 1726853294.79546: variable 'ansible_connection' from source: unknown 13273 1726853294.79549: variable 'ansible_module_compression' from source: unknown 13273 1726853294.79550: variable 'ansible_shell_type' from source: unknown 13273 1726853294.79552: variable 'ansible_shell_executable' from source: unknown 13273 1726853294.79554: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853294.79563: variable 'ansible_pipelining' from source: unknown 13273 1726853294.79582: variable 'ansible_timeout' from source: unknown 13273 1726853294.79594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853294.79815: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853294.79832: variable 'omit' from source: magic vars 13273 1726853294.79841: starting attempt loop 13273 1726853294.79848: running the handler 13273 1726853294.79876: _low_level_execute_command(): starting 13273 1726853294.79894: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853294.80622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853294.80626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13273 1726853294.80628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853294.80632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853294.80689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853294.80719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853294.80767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853294.80775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853294.80841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853294.82542: stdout chunk (state=3): >>>/root <<< 13273 1726853294.82649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853294.82684: stderr chunk (state=3): >>><<< 13273 1726853294.82688: stdout chunk (state=3): >>><<< 13273 1726853294.82713: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853294.82725: _low_level_execute_command(): starting 13273 1726853294.82731: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613 `" && echo ansible-tmp-1726853294.8271067-13975-68547303149613="` echo /root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613 `" ) && sleep 0' 13273 1726853294.83135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853294.83176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853294.83179: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853294.83189: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853294.83192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853294.83229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853294.83236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853294.83238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853294.83299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853294.85287: stdout chunk (state=3): >>>ansible-tmp-1726853294.8271067-13975-68547303149613=/root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613 <<< 13273 1726853294.85344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853294.85430: stderr chunk (state=3): >>><<< 13273 1726853294.85434: stdout chunk (state=3): >>><<< 13273 1726853294.85441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853294.8271067-13975-68547303149613=/root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853294.85449: variable 'ansible_module_compression' from source: unknown 13273 1726853294.85498: ANSIBALLZ: Using lock for package_facts 13273 1726853294.85502: ANSIBALLZ: Acquiring lock 13273 1726853294.85504: ANSIBALLZ: Lock acquired: 140136093119024 13273 1726853294.85507: ANSIBALLZ: Creating module 13273 1726853295.07566: ANSIBALLZ: Writing module into payload 13273 1726853295.07655: ANSIBALLZ: Writing module 13273 1726853295.07675: ANSIBALLZ: Renaming module 13273 1726853295.07681: ANSIBALLZ: Done creating module 13273 1726853295.07707: variable 'ansible_facts' from source: unknown 13273 1726853295.07887: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/AnsiballZ_package_facts.py 13273 1726853295.08130: Sending initial data 13273 1726853295.08134: Sent initial data (161 bytes) 13273 1726853295.08718: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853295.08721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853295.08796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853295.10404: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853295.10455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853295.10523: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmppj4wyv4n /root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/AnsiballZ_package_facts.py <<< 13273 1726853295.10527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/AnsiballZ_package_facts.py" <<< 13273 1726853295.10779: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmppj4wyv4n" to remote "/root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/AnsiballZ_package_facts.py" <<< 13273 1726853295.13730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853295.13734: stdout chunk (state=3): >>><<< 13273 1726853295.13742: stderr chunk (state=3): >>><<< 13273 1726853295.13761: done transferring module to remote 13273 1726853295.13774: _low_level_execute_command(): starting 13273 1726853295.13777: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/ /root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/AnsiballZ_package_facts.py && sleep 0' 13273 1726853295.14940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853295.14943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853295.14959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853295.14981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853295.15012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 <<< 13273 1726853295.15179: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853295.15412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853295.15480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853295.17411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853295.17414: stdout chunk (state=3): >>><<< 13273 1726853295.17417: stderr chunk (state=3): >>><<< 13273 1726853295.17503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853295.17506: _low_level_execute_command(): starting 13273 1726853295.17509: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/AnsiballZ_package_facts.py && sleep 0' 13273 1726853295.18588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853295.18784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853295.18804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 
1726853295.18904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853295.63881: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 13273 1726853295.63906: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": 
"2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 13273 1726853295.63911: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", 
"version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": 
"12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 13273 1726853295.63951: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": 
"57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": 
"1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": 
[{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 13273 1726853295.63963: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 13273 1726853295.64051: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": 
[{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": 
"1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": 
"3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": 
[{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": 
[{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 13273 1726853295.64062: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 13273 1726853295.64118: stdout chunk 
(state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 13273 1726853295.64122: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13273 1726853295.65947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853295.65964: stderr chunk (state=3): >>><<< 13273 1726853295.65967: stdout chunk (state=3): >>><<< 13273 1726853295.66004: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853295.67103: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853295.67122: _low_level_execute_command(): starting 13273 1726853295.67127: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853294.8271067-13975-68547303149613/ > /dev/null 2>&1 && sleep 0' 13273 1726853295.67628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853295.67631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853295.67634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853295.67636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found <<< 13273 1726853295.67638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853295.67713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853295.67772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853295.69641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853295.69673: stderr chunk (state=3): >>><<< 13273 1726853295.69676: stdout chunk (state=3): >>><<< 13273 1726853295.69689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853295.69695: 
handler run complete 13273 1726853295.70525: variable 'ansible_facts' from source: unknown 13273 1726853295.70811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853295.72087: variable 'ansible_facts' from source: unknown 13273 1726853295.72318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853295.72738: attempt loop complete, returning result 13273 1726853295.72762: _execute() done 13273 1726853295.72765: dumping result to json 13273 1726853295.72919: done dumping result, returning 13273 1726853295.72928: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-5fc3-657d-000000000284] 13273 1726853295.72932: sending task result for task 02083763-bbaf-5fc3-657d-000000000284 13273 1726853295.77835: done sending task result for task 02083763-bbaf-5fc3-657d-000000000284 13273 1726853295.77839: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853295.77878: no more pending results, returning what we have 13273 1726853295.77880: results queue empty 13273 1726853295.77881: checking for any_errors_fatal 13273 1726853295.77883: done checking for any_errors_fatal 13273 1726853295.77883: checking for max_fail_percentage 13273 1726853295.77884: done checking for max_fail_percentage 13273 1726853295.77885: checking to see if all hosts have failed and the running result is not ok 13273 1726853295.77885: done checking to see if all hosts have failed 13273 1726853295.77886: getting the remaining hosts for this loop 13273 1726853295.77886: done getting the remaining hosts for this loop 13273 1726853295.77889: getting the next task for host managed_node3 13273 1726853295.77893: done getting next task for host managed_node3 13273 
1726853295.77895: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853295.77897: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853295.77902: getting variables 13273 1726853295.77903: in VariableManager get_vars() 13273 1726853295.77930: Calling all_inventory to load vars for managed_node3 13273 1726853295.77932: Calling groups_inventory to load vars for managed_node3 13273 1726853295.77933: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853295.77939: Calling all_plugins_play to load vars for managed_node3 13273 1726853295.77940: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853295.77942: Calling groups_plugins_play to load vars for managed_node3 13273 1726853295.78706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853295.79721: done with get_vars() 13273 1726853295.79737: done getting variables 13273 1726853295.79790: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:28:15 -0400 (0:00:01.023) 0:00:13.687 ****** 13273 1726853295.79820: entering _queue_task() for managed_node3/debug 13273 1726853295.80061: worker is 1 (out of 1 available) 13273 1726853295.80075: exiting _queue_task() for managed_node3/debug 13273 1726853295.80086: done queuing things up, now waiting for results queue to drain 13273 1726853295.80087: waiting for pending results... 13273 1726853295.80286: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853295.80413: in run() - task 02083763-bbaf-5fc3-657d-000000000027 13273 1726853295.80418: variable 'ansible_search_path' from source: unknown 13273 1726853295.80420: variable 'ansible_search_path' from source: unknown 13273 1726853295.80456: calling self._execute() 13273 1726853295.80515: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853295.80521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853295.80530: variable 'omit' from source: magic vars 13273 1726853295.80833: variable 'ansible_distribution_major_version' from source: facts 13273 1726853295.80852: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853295.80860: variable 'omit' from source: magic vars 13273 1726853295.80921: variable 'omit' from source: magic vars 13273 1726853295.80986: variable 'network_provider' from source: set_fact 13273 1726853295.81000: variable 'omit' from source: magic vars 13273 1726853295.81053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853295.81067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853295.81086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 
1726853295.81099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853295.81108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853295.81137: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853295.81141: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853295.81146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853295.81234: Set connection var ansible_connection to ssh 13273 1726853295.81248: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853295.81251: Set connection var ansible_shell_executable to /bin/sh 13273 1726853295.81254: Set connection var ansible_shell_type to sh 13273 1726853295.81256: Set connection var ansible_pipelining to False 13273 1726853295.81259: Set connection var ansible_timeout to 10 13273 1726853295.81283: variable 'ansible_shell_executable' from source: unknown 13273 1726853295.81287: variable 'ansible_connection' from source: unknown 13273 1726853295.81290: variable 'ansible_module_compression' from source: unknown 13273 1726853295.81293: variable 'ansible_shell_type' from source: unknown 13273 1726853295.81295: variable 'ansible_shell_executable' from source: unknown 13273 1726853295.81297: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853295.81299: variable 'ansible_pipelining' from source: unknown 13273 1726853295.81302: variable 'ansible_timeout' from source: unknown 13273 1726853295.81310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853295.81410: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853295.81418: variable 'omit' from source: magic vars 13273 1726853295.81424: starting attempt loop 13273 1726853295.81426: running the handler 13273 1726853295.81463: handler run complete 13273 1726853295.81474: attempt loop complete, returning result 13273 1726853295.81477: _execute() done 13273 1726853295.81479: dumping result to json 13273 1726853295.81482: done dumping result, returning 13273 1726853295.81490: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-5fc3-657d-000000000027] 13273 1726853295.81495: sending task result for task 02083763-bbaf-5fc3-657d-000000000027 ok: [managed_node3] => {} MSG: Using network provider: nm 13273 1726853295.81639: no more pending results, returning what we have 13273 1726853295.81644: results queue empty 13273 1726853295.81645: checking for any_errors_fatal 13273 1726853295.81653: done checking for any_errors_fatal 13273 1726853295.81654: checking for max_fail_percentage 13273 1726853295.81655: done checking for max_fail_percentage 13273 1726853295.81656: checking to see if all hosts have failed and the running result is not ok 13273 1726853295.81657: done checking to see if all hosts have failed 13273 1726853295.81657: getting the remaining hosts for this loop 13273 1726853295.81659: done getting the remaining hosts for this loop 13273 1726853295.81662: getting the next task for host managed_node3 13273 1726853295.81667: done getting next task for host managed_node3 13273 1726853295.81673: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853295.81676: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853295.81687: getting variables 13273 1726853295.81689: in VariableManager get_vars() 13273 1726853295.81730: Calling all_inventory to load vars for managed_node3 13273 1726853295.81733: Calling groups_inventory to load vars for managed_node3 13273 1726853295.81735: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853295.81744: Calling all_plugins_play to load vars for managed_node3 13273 1726853295.81746: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853295.81749: Calling groups_plugins_play to load vars for managed_node3 13273 1726853295.82331: done sending task result for task 02083763-bbaf-5fc3-657d-000000000027 13273 1726853295.82334: WORKER PROCESS EXITING 13273 1726853295.82590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853295.83489: done with get_vars() 13273 1726853295.83505: done getting variables 13273 1726853295.83573: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:28:15 -0400 (0:00:00.037) 0:00:13.725 ****** 13273 1726853295.83595: entering _queue_task() for managed_node3/fail 13273 1726853295.83597: Creating lock for fail 13273 1726853295.83815: worker is 1 (out of 1 available) 13273 1726853295.83829: exiting _queue_task() for managed_node3/fail 13273 1726853295.83841: done queuing things up, now waiting for results queue to drain 13273 1726853295.83842: waiting for pending results... 13273 1726853295.84021: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853295.84187: in run() - task 02083763-bbaf-5fc3-657d-000000000028 13273 1726853295.84192: variable 'ansible_search_path' from source: unknown 13273 1726853295.84204: variable 'ansible_search_path' from source: unknown 13273 1726853295.84252: calling self._execute() 13273 1726853295.84332: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853295.84337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853295.84348: variable 'omit' from source: magic vars 13273 1726853295.84679: variable 'ansible_distribution_major_version' from source: facts 13273 1726853295.84689: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853295.84770: variable 'network_state' from source: role '' defaults 13273 1726853295.84782: Evaluated conditional (network_state != {}): False 13273 1726853295.84786: when evaluation is False, skipping this task 13273 1726853295.84789: _execute() done 13273 1726853295.84792: dumping result to json 13273 1726853295.84796: done dumping result, returning 13273 1726853295.84799: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-5fc3-657d-000000000028] 13273 1726853295.84805: sending task result for task 02083763-bbaf-5fc3-657d-000000000028 13273 1726853295.84894: done sending task result for task 02083763-bbaf-5fc3-657d-000000000028 13273 1726853295.84897: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853295.84967: no more pending results, returning what we have 13273 1726853295.84977: results queue empty 13273 1726853295.84979: checking for any_errors_fatal 13273 1726853295.84982: done checking for any_errors_fatal 13273 1726853295.84983: checking for max_fail_percentage 13273 1726853295.84985: done checking for max_fail_percentage 13273 1726853295.84985: checking to see if all hosts have failed and the running result is not ok 13273 1726853295.84986: done checking to see if all hosts have failed 13273 1726853295.84987: getting the remaining hosts for this loop 13273 1726853295.84988: done getting the remaining hosts for this loop 13273 1726853295.84991: getting the next task for host managed_node3 13273 1726853295.85004: done getting next task for host managed_node3 13273 1726853295.85011: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853295.85014: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13273 1726853295.85036: getting variables 13273 1726853295.85037: in VariableManager get_vars() 13273 1726853295.85101: Calling all_inventory to load vars for managed_node3 13273 1726853295.85104: Calling groups_inventory to load vars for managed_node3 13273 1726853295.85106: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853295.85112: Calling all_plugins_play to load vars for managed_node3 13273 1726853295.85114: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853295.85116: Calling groups_plugins_play to load vars for managed_node3 13273 1726853295.86045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853295.87185: done with get_vars() 13273 1726853295.87210: done getting variables 13273 1726853295.87264: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:28:15 -0400 (0:00:00.036) 0:00:13.762 ****** 13273 1726853295.87299: entering _queue_task() for managed_node3/fail 13273 1726853295.87805: worker is 1 (out of 1 available) 13273 1726853295.87815: exiting _queue_task() for managed_node3/fail 13273 1726853295.87825: done queuing things up, now waiting for results queue to drain 13273 1726853295.87826: waiting for pending results... 
13273 1726853295.87988: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853295.88061: in run() - task 02083763-bbaf-5fc3-657d-000000000029 13273 1726853295.88083: variable 'ansible_search_path' from source: unknown 13273 1726853295.88090: variable 'ansible_search_path' from source: unknown 13273 1726853295.88131: calling self._execute() 13273 1726853295.88278: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853295.88281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853295.88284: variable 'omit' from source: magic vars 13273 1726853295.88617: variable 'ansible_distribution_major_version' from source: facts 13273 1726853295.88635: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853295.88752: variable 'network_state' from source: role '' defaults 13273 1726853295.88767: Evaluated conditional (network_state != {}): False 13273 1726853295.88777: when evaluation is False, skipping this task 13273 1726853295.88783: _execute() done 13273 1726853295.88790: dumping result to json 13273 1726853295.88797: done dumping result, returning 13273 1726853295.88819: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-5fc3-657d-000000000029] 13273 1726853295.88822: sending task result for task 02083763-bbaf-5fc3-657d-000000000029
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13273 1726853295.89021: no more pending results, returning what we have 13273 1726853295.89025: results queue empty 13273 1726853295.89026: checking for any_errors_fatal 13273 1726853295.89036: done checking for any_errors_fatal 
13273 1726853295.89037: checking for max_fail_percentage 13273 1726853295.89039: done checking for max_fail_percentage 13273 1726853295.89040: checking to see if all hosts have failed and the running result is not ok 13273 1726853295.89040: done checking to see if all hosts have failed 13273 1726853295.89041: getting the remaining hosts for this loop 13273 1726853295.89043: done getting the remaining hosts for this loop 13273 1726853295.89046: getting the next task for host managed_node3 13273 1726853295.89053: done getting next task for host managed_node3 13273 1726853295.89058: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853295.89061: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853295.89080: getting variables 13273 1726853295.89082: in VariableManager get_vars() 13273 1726853295.89139: Calling all_inventory to load vars for managed_node3 13273 1726853295.89142: Calling groups_inventory to load vars for managed_node3 13273 1726853295.89144: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853295.89155: Calling all_plugins_play to load vars for managed_node3 13273 1726853295.89158: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853295.89161: Calling groups_plugins_play to load vars for managed_node3 13273 1726853295.89786: done sending task result for task 02083763-bbaf-5fc3-657d-000000000029 13273 1726853295.89790: WORKER PROCESS EXITING 13273 1726853295.90159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853295.91012: done with get_vars() 13273 1726853295.91029: done getting variables 13273 1726853295.91074: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:28:15 -0400 (0:00:00.037) 0:00:13.800 ****** 13273 1726853295.91096: entering _queue_task() for managed_node3/fail 13273 1726853295.91321: worker is 1 (out of 1 available) 13273 1726853295.91333: exiting _queue_task() for managed_node3/fail 13273 1726853295.91345: done queuing things up, now waiting for results queue to drain 13273 1726853295.91346: waiting for pending results... 
13273 1726853295.91526: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853295.91609: in run() - task 02083763-bbaf-5fc3-657d-00000000002a 13273 1726853295.91621: variable 'ansible_search_path' from source: unknown 13273 1726853295.91625: variable 'ansible_search_path' from source: unknown 13273 1726853295.91654: calling self._execute() 13273 1726853295.91722: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853295.91727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853295.91736: variable 'omit' from source: magic vars 13273 1726853295.92125: variable 'ansible_distribution_major_version' from source: facts 13273 1726853295.92132: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853295.92576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853295.94614: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853295.94704: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853295.94748: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853295.94792: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853295.94823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853295.94907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853295.94951: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853295.94983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853295.95028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853295.95051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853295.95156: variable 'ansible_distribution_major_version' from source: facts 13273 1726853295.95182: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13273 1726853295.95306: variable 'ansible_distribution' from source: facts 13273 1726853295.95317: variable '__network_rh_distros' from source: role '' defaults 13273 1726853295.95331: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13273 1726853295.95602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853295.95631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853295.95663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 
1726853295.95713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853295.95733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853295.95788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853295.95821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853295.95853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853295.95926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853295.95929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853295.95972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853295.96001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13273 1726853295.96035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853295.96146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853295.96149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853295.96434: variable 'network_connections' from source: task vars 13273 1726853295.96454: variable 'controller_profile' from source: play vars 13273 1726853295.96524: variable 'controller_profile' from source: play vars 13273 1726853295.96539: variable 'controller_device' from source: play vars 13273 1726853295.96610: variable 'controller_device' from source: play vars 13273 1726853295.96627: variable 'port1_profile' from source: play vars 13273 1726853295.96699: variable 'port1_profile' from source: play vars 13273 1726853295.96711: variable 'dhcp_interface1' from source: play vars 13273 1726853295.96775: variable 'dhcp_interface1' from source: play vars 13273 1726853295.96787: variable 'controller_profile' from source: play vars 13273 1726853295.96878: variable 'controller_profile' from source: play vars 13273 1726853295.96881: variable 'port2_profile' from source: play vars 13273 1726853295.96925: variable 'port2_profile' from source: play vars 13273 1726853295.96936: variable 'dhcp_interface2' from source: play vars 13273 1726853295.97003: variable 'dhcp_interface2' from source: play vars 13273 1726853295.97015: variable 'controller_profile' from source: play vars 13273 1726853295.97092: variable 'controller_profile' from source: play vars 13273 1726853295.97111: 
variable 'network_state' from source: role '' defaults 13273 1726853295.97213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853295.97374: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853295.97417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853295.97449: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853295.97468: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853295.97513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853295.97529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853295.97549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853295.97570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853295.97603: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13273 1726853295.97606: when evaluation is False, skipping this task 13273 1726853295.97609: _execute() done 13273 1726853295.97611: dumping result to 
json 13273 1726853295.97613: done dumping result, returning 13273 1726853295.97620: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-5fc3-657d-00000000002a] 13273 1726853295.97623: sending task result for task 02083763-bbaf-5fc3-657d-00000000002a 13273 1726853295.97726: done sending task result for task 02083763-bbaf-5fc3-657d-00000000002a 13273 1726853295.97728: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
13273 1726853295.97803: no more pending results, returning what we have 13273 1726853295.97807: results queue empty 13273 1726853295.97808: checking for any_errors_fatal 13273 1726853295.97814: done checking for any_errors_fatal 13273 1726853295.97815: checking for max_fail_percentage 13273 1726853295.97816: done checking for max_fail_percentage 13273 1726853295.97817: checking to see if all hosts have failed and the running result is not ok 13273 1726853295.97818: done checking to see if all hosts have failed 13273 1726853295.97819: getting the remaining hosts for this loop 13273 1726853295.97821: done getting the remaining hosts for this loop 13273 1726853295.97824: getting the next task for host managed_node3 13273 1726853295.97830: done getting next task for host managed_node3 13273 1726853295.97834: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853295.97838: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853295.97853: getting variables 13273 1726853295.97855: in VariableManager get_vars() 13273 1726853295.97909: Calling all_inventory to load vars for managed_node3 13273 1726853295.97912: Calling groups_inventory to load vars for managed_node3 13273 1726853295.97914: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853295.97922: Calling all_plugins_play to load vars for managed_node3 13273 1726853295.97924: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853295.97927: Calling groups_plugins_play to load vars for managed_node3 13273 1726853295.98705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853295.99661: done with get_vars() 13273 1726853295.99679: done getting variables 13273 1726853295.99766: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:28:15 -0400 (0:00:00.086) 0:00:13.887 ****** 
13273 1726853295.99791: entering _queue_task() for managed_node3/dnf 13273 1726853296.00113: worker is 1 (out of 1 available) 13273 1726853296.00126: exiting _queue_task() for managed_node3/dnf 13273 1726853296.00137: done queuing things up, now waiting for results queue to drain 13273 1726853296.00138: waiting for pending results... 13273 1726853296.00496: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853296.00570: in run() - task 02083763-bbaf-5fc3-657d-00000000002b 13273 1726853296.00597: variable 'ansible_search_path' from source: unknown 13273 1726853296.00775: variable 'ansible_search_path' from source: unknown 13273 1726853296.00780: calling self._execute() 13273 1726853296.00782: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853296.00786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853296.00788: variable 'omit' from source: magic vars 13273 1726853296.01125: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.01134: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853296.01270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853296.02782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853296.02833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853296.02863: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853296.02890: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853296.02909: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853296.02973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.02993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.03010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.03039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.03053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.03129: variable 'ansible_distribution' from source: facts 13273 1726853296.03133: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.03151: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13273 1726853296.03224: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853296.03311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.03327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.03343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.03374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.03390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.03418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.03434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.03451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.03483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.03494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.03520: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.03535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.03552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.03581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.03590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.03694: variable 'network_connections' from source: task vars 13273 1726853296.03697: variable 'controller_profile' from source: play vars 13273 1726853296.03740: variable 'controller_profile' from source: play vars 13273 1726853296.03748: variable 'controller_device' from source: play vars 13273 1726853296.03837: variable 'controller_device' from source: play vars 13273 1726853296.03840: variable 'port1_profile' from source: play vars 13273 1726853296.03859: variable 'port1_profile' from source: play vars 13273 1726853296.03866: variable 'dhcp_interface1' from source: play vars 13273 1726853296.03908: variable 'dhcp_interface1' from source: play vars 13273 1726853296.03915: variable 'controller_profile' from source: play vars 13273 1726853296.04176: variable 'controller_profile' from source: play vars 13273 1726853296.04179: variable 'port2_profile' from source: play vars 13273 
1726853296.04181: variable 'port2_profile' from source: play vars 13273 1726853296.04183: variable 'dhcp_interface2' from source: play vars 13273 1726853296.04185: variable 'dhcp_interface2' from source: play vars 13273 1726853296.04187: variable 'controller_profile' from source: play vars 13273 1726853296.04189: variable 'controller_profile' from source: play vars 13273 1726853296.04249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853296.04413: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853296.04453: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853296.04490: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853296.04522: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853296.04566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853296.04611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853296.04642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.04674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853296.04736: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853296.04973: variable 
'network_connections' from source: task vars
13273 1726853296.04984: variable 'controller_profile' from source: play vars
13273 1726853296.05047: variable 'controller_profile' from source: play vars
13273 1726853296.05059: variable 'controller_device' from source: play vars
13273 1726853296.05123: variable 'controller_device' from source: play vars
13273 1726853296.05137: variable 'port1_profile' from source: play vars
13273 1726853296.05200: variable 'port1_profile' from source: play vars
13273 1726853296.05212: variable 'dhcp_interface1' from source: play vars
13273 1726853296.05273: variable 'dhcp_interface1' from source: play vars
13273 1726853296.05286: variable 'controller_profile' from source: play vars
13273 1726853296.05344: variable 'controller_profile' from source: play vars
13273 1726853296.05357: variable 'port2_profile' from source: play vars
13273 1726853296.05418: variable 'port2_profile' from source: play vars
13273 1726853296.05431: variable 'dhcp_interface2' from source: play vars
13273 1726853296.05493: variable 'dhcp_interface2' from source: play vars
13273 1726853296.05505: variable 'controller_profile' from source: play vars
13273 1726853296.05563: variable 'controller_profile' from source: play vars
13273 1726853296.05602: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
13273 1726853296.05611: when evaluation is False, skipping this task
13273 1726853296.05618: _execute() done
13273 1726853296.05625: dumping result to json
13273 1726853296.05631: done dumping result, returning
13273 1726853296.05642: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-00000000002b]
13273 1726853296.05776: sending task result for task 02083763-bbaf-5fc3-657d-00000000002b
13273 1726853296.05851: done sending task result for task 02083763-bbaf-5fc3-657d-00000000002b
13273 1726853296.05855: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
13273 1726853296.05906: no more pending results, returning what we have
13273 1726853296.05909: results queue empty
13273 1726853296.05910: checking for any_errors_fatal
13273 1726853296.05917: done checking for any_errors_fatal
13273 1726853296.05918: checking for max_fail_percentage
13273 1726853296.05919: done checking for max_fail_percentage
13273 1726853296.05920: checking to see if all hosts have failed and the running result is not ok
13273 1726853296.05921: done checking to see if all hosts have failed
13273 1726853296.05921: getting the remaining hosts for this loop
13273 1726853296.05923: done getting the remaining hosts for this loop
13273 1726853296.05934: getting the next task for host managed_node3
13273 1726853296.05941: done getting next task for host managed_node3
13273 1726853296.05947: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
13273 1726853296.05950: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853296.05964: getting variables
13273 1726853296.05965: in VariableManager get_vars()
13273 1726853296.06013: Calling all_inventory to load vars for managed_node3
13273 1726853296.06015: Calling groups_inventory to load vars for managed_node3
13273 1726853296.06017: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853296.06025: Calling all_plugins_play to load vars for managed_node3
13273 1726853296.06027: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853296.06029: Calling groups_plugins_play to load vars for managed_node3
13273 1726853296.07356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853296.08879: done with get_vars()
13273 1726853296.08907: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13273 1726853296.08985: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:28:16 -0400 (0:00:00.092) 0:00:13.979 ******
13273 1726853296.09018: entering _queue_task() for managed_node3/yum
13273 1726853296.09020: Creating lock for yum
13273 1726853296.09357: worker is 1 (out of 1 available)
13273 1726853296.09370: exiting _queue_task() for managed_node3/yum
13273 1726853296.09384: done queuing things up, now waiting for results queue to drain
13273 1726853296.09385: waiting for pending results...
13273 1726853296.09794: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
13273 1726853296.09799: in run() - task 02083763-bbaf-5fc3-657d-00000000002c
13273 1726853296.09812: variable 'ansible_search_path' from source: unknown
13273 1726853296.09821: variable 'ansible_search_path' from source: unknown
13273 1726853296.09860: calling self._execute()
13273 1726853296.09953: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853296.09966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853296.09982: variable 'omit' from source: magic vars
13273 1726853296.10339: variable 'ansible_distribution_major_version' from source: facts
13273 1726853296.10356: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853296.10526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13273 1726853296.12733: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13273 1726853296.12813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13273 1726853296.12858: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13273 1726853296.12899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13273 1726853296.12934: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13273 1726853296.13015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.13055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.13088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.13134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.13159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.13258: variable 'ansible_distribution_major_version' from source: facts
13273 1726853296.13280: Evaluated conditional (ansible_distribution_major_version | int < 8): False
13273 1726853296.13288: when evaluation is False, skipping this task
13273 1726853296.13295: _execute() done
13273 1726853296.13302: dumping result to json
13273 1726853296.13309: done dumping result, returning
13273 1726853296.13366: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-00000000002c]
13273 1726853296.13370: sending task result for task 02083763-bbaf-5fc3-657d-00000000002c
13273 1726853296.13445: done sending task result for task 02083763-bbaf-5fc3-657d-00000000002c
13273 1726853296.13448: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
13273 1726853296.13521: no more pending results, returning what we have
13273 1726853296.13525: results queue empty
13273 1726853296.13526: checking for any_errors_fatal
13273 1726853296.13532: done checking for any_errors_fatal
13273 1726853296.13533: checking for max_fail_percentage
13273 1726853296.13536: done checking for max_fail_percentage
13273 1726853296.13537: checking to see if all hosts have failed and the running result is not ok
13273 1726853296.13537: done checking to see if all hosts have failed
13273 1726853296.13538: getting the remaining hosts for this loop
13273 1726853296.13539: done getting the remaining hosts for this loop
13273 1726853296.13543: getting the next task for host managed_node3
13273 1726853296.13550: done getting next task for host managed_node3
13273 1726853296.13554: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
13273 1726853296.13557: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853296.13775: getting variables
13273 1726853296.13777: in VariableManager get_vars()
13273 1726853296.13826: Calling all_inventory to load vars for managed_node3
13273 1726853296.13829: Calling groups_inventory to load vars for managed_node3
13273 1726853296.13831: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853296.13840: Calling all_plugins_play to load vars for managed_node3
13273 1726853296.13843: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853296.13846: Calling groups_plugins_play to load vars for managed_node3
13273 1726853296.15334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853296.16826: done with get_vars()
13273 1726853296.16847: done getting variables
13273 1726853296.16908: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 13:28:16 -0400 (0:00:00.079) 0:00:14.058 ******
13273 1726853296.16940: entering _queue_task() for managed_node3/fail
13273 1726853296.17261: worker is 1 (out of 1 available)
13273 1726853296.17274: exiting _queue_task() for managed_node3/fail
13273 1726853296.17286: done queuing things up, now waiting for results queue to drain
13273 1726853296.17287: waiting for pending results...
13273 1726853296.17628: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
13273 1726853296.17824: in run() - task 02083763-bbaf-5fc3-657d-00000000002d
13273 1726853296.17828: variable 'ansible_search_path' from source: unknown
13273 1726853296.17831: variable 'ansible_search_path' from source: unknown
13273 1726853296.17833: calling self._execute()
13273 1726853296.17836: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853296.17839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853296.17842: variable 'omit' from source: magic vars
13273 1726853296.18251: variable 'ansible_distribution_major_version' from source: facts
13273 1726853296.18268: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853296.18406: variable '__network_wireless_connections_defined' from source: role '' defaults
13273 1726853296.18622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13273 1726853296.20104: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13273 1726853296.20156: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13273 1726853296.20190: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13273 1726853296.20212: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13273 1726853296.20231: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13273 1726853296.20290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.20314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.20332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.20359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.20370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.20409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.20423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.20440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.20465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.20477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.20505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.20527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.20541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.20565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.20577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.20692: variable 'network_connections' from source: task vars
13273 1726853296.20703: variable 'controller_profile' from source: play vars
13273 1726853296.20752: variable 'controller_profile' from source: play vars
13273 1726853296.20760: variable 'controller_device' from source: play vars
13273 1726853296.20804: variable 'controller_device' from source: play vars
13273 1726853296.20811: variable 'port1_profile' from source: play vars
13273 1726853296.20855: variable 'port1_profile' from source: play vars
13273 1726853296.20862: variable 'dhcp_interface1' from source: play vars
13273 1726853296.20966: variable 'dhcp_interface1' from source: play vars
13273 1726853296.20970: variable 'controller_profile' from source: play vars
13273 1726853296.20974: variable 'controller_profile' from source: play vars
13273 1726853296.21016: variable 'port2_profile' from source: play vars
13273 1726853296.21044: variable 'port2_profile' from source: play vars
13273 1726853296.21140: variable 'dhcp_interface2' from source: play vars
13273 1726853296.21145: variable 'dhcp_interface2' from source: play vars
13273 1726853296.21148: variable 'controller_profile' from source: play vars
13273 1726853296.21381: variable 'controller_profile' from source: play vars
13273 1726853296.21385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13273 1726853296.21409: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13273 1726853296.21449: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13273 1726853296.21495: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13273 1726853296.21534: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13273 1726853296.21582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13273 1726853296.21608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13273 1726853296.21645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.21683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13273 1726853296.21757: variable '__network_team_connections_defined' from source: role '' defaults
13273 1726853296.21990: variable 'network_connections' from source: task vars
13273 1726853296.21993: variable 'controller_profile' from source: play vars
13273 1726853296.22041: variable 'controller_profile' from source: play vars
13273 1726853296.22044: variable 'controller_device' from source: play vars
13273 1726853296.22166: variable 'controller_device' from source: play vars
13273 1726853296.22169: variable 'port1_profile' from source: play vars
13273 1726853296.22178: variable 'port1_profile' from source: play vars
13273 1726853296.22181: variable 'dhcp_interface1' from source: play vars
13273 1726853296.22233: variable 'dhcp_interface1' from source: play vars
13273 1726853296.22256: variable 'controller_profile' from source: play vars
13273 1726853296.22299: variable 'controller_profile' from source: play vars
13273 1726853296.22312: variable 'port2_profile' from source: play vars
13273 1726853296.22379: variable 'port2_profile' from source: play vars
13273 1726853296.22394: variable 'dhcp_interface2' from source: play vars
13273 1726853296.22456: variable 'dhcp_interface2' from source: play vars
13273 1726853296.22466: variable 'controller_profile' from source: play vars
13273 1726853296.22531: variable 'controller_profile' from source: play vars
13273 1726853296.22569: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
13273 1726853296.22602: when evaluation is False, skipping this task
13273 1726853296.22605: _execute() done
13273 1726853296.22607: dumping result to json
13273 1726853296.22609: done dumping result, returning
13273 1726853296.22611: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-00000000002d]
13273 1726853296.22616: sending task result for task 02083763-bbaf-5fc3-657d-00000000002d
13273 1726853296.22951: done sending task result for task 02083763-bbaf-5fc3-657d-00000000002d
13273 1726853296.22954: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
13273 1726853296.22999: no more pending results, returning what we have
13273 1726853296.23003: results queue empty
13273 1726853296.23004: checking for any_errors_fatal
13273 1726853296.23008: done checking for any_errors_fatal
13273 1726853296.23009: checking for max_fail_percentage
13273 1726853296.23011: done checking for max_fail_percentage
13273 1726853296.23012: checking to see if all hosts have failed and the running result is not ok
13273 1726853296.23012: done checking to see if all hosts have failed
13273 1726853296.23013: getting the remaining hosts for this loop
13273 1726853296.23014: done getting the remaining hosts for this loop
13273 1726853296.23017: getting the next task for host managed_node3
13273 1726853296.23024: done getting next task for host managed_node3
13273 1726853296.23027: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
13273 1726853296.23030: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853296.23045: getting variables
13273 1726853296.23046: in VariableManager get_vars()
13273 1726853296.23090: Calling all_inventory to load vars for managed_node3
13273 1726853296.23092: Calling groups_inventory to load vars for managed_node3
13273 1726853296.23093: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853296.23100: Calling all_plugins_play to load vars for managed_node3
13273 1726853296.23102: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853296.23103: Calling groups_plugins_play to load vars for managed_node3
13273 1726853296.23869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853296.24737: done with get_vars()
13273 1726853296.24757: done getting variables
13273 1726853296.24802: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 13:28:16 -0400 (0:00:00.078) 0:00:14.137 ******
13273 1726853296.24824: entering _queue_task() for managed_node3/package
13273 1726853296.25055: worker is 1 (out of 1 available)
13273 1726853296.25067: exiting _queue_task() for managed_node3/package
13273 1726853296.25082: done queuing things up, now waiting for results queue to drain
13273 1726853296.25083: waiting for pending results...
13273 1726853296.25249: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
13273 1726853296.25330: in run() - task 02083763-bbaf-5fc3-657d-00000000002e
13273 1726853296.25344: variable 'ansible_search_path' from source: unknown
13273 1726853296.25348: variable 'ansible_search_path' from source: unknown
13273 1726853296.25375: calling self._execute()
13273 1726853296.25440: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853296.25447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853296.25454: variable 'omit' from source: magic vars
13273 1726853296.25908: variable 'ansible_distribution_major_version' from source: facts
13273 1726853296.25912: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853296.26081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13273 1726853296.26404: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13273 1726853296.26490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13273 1726853296.26532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13273 1726853296.26590: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13273 1726853296.26724: variable 'network_packages' from source: role '' defaults
13273 1726853296.26853: variable '__network_provider_setup' from source: role '' defaults
13273 1726853296.26888: variable '__network_service_name_default_nm' from source: role '' defaults
13273 1726853296.26960: variable '__network_service_name_default_nm' from source: role '' defaults
13273 1726853296.26994: variable '__network_packages_default_nm' from source: role '' defaults
13273 1726853296.27061: variable '__network_packages_default_nm' from source: role '' defaults
13273 1726853296.27316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13273 1726853296.33950: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13273 1726853296.34036: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13273 1726853296.34078: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13273 1726853296.34102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13273 1726853296.34133: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13273 1726853296.34183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.34203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.34223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.34255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.34278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.34309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.34328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.34347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.34376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.34384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.34545: variable '__network_packages_default_gobject_packages' from source: role '' defaults
13273 1726853296.34647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.34665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.34700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.34729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.34749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.34822: variable 'ansible_python' from source: facts
13273 1726853296.34847: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
13273 1726853296.34905: variable '__network_wpa_supplicant_required' from source: role '' defaults
13273 1726853296.34975: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
13273 1726853296.35085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.35103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.35120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.35149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.35159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.35194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853296.35215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853296.35231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.35275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853296.35284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853296.35390: variable 'network_connections' from source: task vars
13273 1726853296.35405: variable 'controller_profile' from source: play vars
13273 1726853296.35504: variable 'controller_profile' from source: play vars
13273 1726853296.35507: variable 'controller_device' from source: play vars
13273 1726853296.35612: variable 'controller_device' from source: play vars
13273 1726853296.35615: variable 'port1_profile' from source: play vars
13273 1726853296.35693: variable 'port1_profile' from source: play vars
13273 1726853296.35705: variable 'dhcp_interface1' from source: play vars
13273 1726853296.35772: variable 'dhcp_interface1' from source: play vars
13273 1726853296.35779: variable 'controller_profile' from source: play vars
13273 1726853296.35858: variable 'controller_profile' from source: play vars
13273 1726853296.35866: variable 'port2_profile' from source: play vars
13273 1726853296.35947: variable 'port2_profile' from source: play vars
13273 1726853296.35955: variable 'dhcp_interface2' from source: play vars
13273 1726853296.36049: variable 'dhcp_interface2' from source: play vars
13273 1726853296.36052: variable 'controller_profile' from source: play vars
13273 1726853296.36120: variable 'controller_profile' from source: play vars
13273 1726853296.36178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13273 1726853296.36198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13273 1726853296.36221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853296.36255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13273 1726853296.36294: variable '__network_wireless_connections_defined' from source: role '' defaults
13273 1726853296.36493: variable 'network_connections' from source: task vars
13273 1726853296.36497: variable 'controller_profile' from source: play vars
13273 1726853296.36564: variable 'controller_profile' from source: play vars
13273 1726853296.36573: variable 'controller_device' from source: play vars
13273 1726853296.36641: variable 'controller_device' from source: play vars
13273 1726853296.36652: variable 'port1_profile' from source: play vars
13273 1726853296.36737: variable 'port1_profile' from source: play vars
13273 1726853296.36740: variable 'dhcp_interface1' from source: play vars
13273 1726853296.36837: variable 'dhcp_interface1' from source: 
play vars 13273 1726853296.36840: variable 'controller_profile' from source: play vars 13273 1726853296.36911: variable 'controller_profile' from source: play vars 13273 1726853296.36919: variable 'port2_profile' from source: play vars 13273 1726853296.37030: variable 'port2_profile' from source: play vars 13273 1726853296.37033: variable 'dhcp_interface2' from source: play vars 13273 1726853296.37116: variable 'dhcp_interface2' from source: play vars 13273 1726853296.37120: variable 'controller_profile' from source: play vars 13273 1726853296.37190: variable 'controller_profile' from source: play vars 13273 1726853296.37228: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853296.37288: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853296.37528: variable 'network_connections' from source: task vars 13273 1726853296.37531: variable 'controller_profile' from source: play vars 13273 1726853296.37581: variable 'controller_profile' from source: play vars 13273 1726853296.37587: variable 'controller_device' from source: play vars 13273 1726853296.37633: variable 'controller_device' from source: play vars 13273 1726853296.37640: variable 'port1_profile' from source: play vars 13273 1726853296.37689: variable 'port1_profile' from source: play vars 13273 1726853296.37695: variable 'dhcp_interface1' from source: play vars 13273 1726853296.37741: variable 'dhcp_interface1' from source: play vars 13273 1726853296.37749: variable 'controller_profile' from source: play vars 13273 1726853296.37795: variable 'controller_profile' from source: play vars 13273 1726853296.37800: variable 'port2_profile' from source: play vars 13273 1726853296.37849: variable 'port2_profile' from source: play vars 13273 1726853296.37855: variable 'dhcp_interface2' from source: play vars 13273 1726853296.37902: variable 'dhcp_interface2' from source: play vars 13273 1726853296.37907: variable 'controller_profile' from 
source: play vars 13273 1726853296.37956: variable 'controller_profile' from source: play vars 13273 1726853296.37976: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853296.38028: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853296.38238: variable 'network_connections' from source: task vars 13273 1726853296.38242: variable 'controller_profile' from source: play vars 13273 1726853296.38292: variable 'controller_profile' from source: play vars 13273 1726853296.38298: variable 'controller_device' from source: play vars 13273 1726853296.38343: variable 'controller_device' from source: play vars 13273 1726853296.38352: variable 'port1_profile' from source: play vars 13273 1726853296.38399: variable 'port1_profile' from source: play vars 13273 1726853296.38405: variable 'dhcp_interface1' from source: play vars 13273 1726853296.38451: variable 'dhcp_interface1' from source: play vars 13273 1726853296.38456: variable 'controller_profile' from source: play vars 13273 1726853296.38504: variable 'controller_profile' from source: play vars 13273 1726853296.38510: variable 'port2_profile' from source: play vars 13273 1726853296.38556: variable 'port2_profile' from source: play vars 13273 1726853296.38561: variable 'dhcp_interface2' from source: play vars 13273 1726853296.38609: variable 'dhcp_interface2' from source: play vars 13273 1726853296.38614: variable 'controller_profile' from source: play vars 13273 1726853296.38661: variable 'controller_profile' from source: play vars 13273 1726853296.38705: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853296.38747: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853296.38755: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853296.38796: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 
1726853296.38930: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853296.39225: variable 'network_connections' from source: task vars 13273 1726853296.39228: variable 'controller_profile' from source: play vars 13273 1726853296.39276: variable 'controller_profile' from source: play vars 13273 1726853296.39283: variable 'controller_device' from source: play vars 13273 1726853296.39323: variable 'controller_device' from source: play vars 13273 1726853296.39330: variable 'port1_profile' from source: play vars 13273 1726853296.39376: variable 'port1_profile' from source: play vars 13273 1726853296.39382: variable 'dhcp_interface1' from source: play vars 13273 1726853296.39423: variable 'dhcp_interface1' from source: play vars 13273 1726853296.39429: variable 'controller_profile' from source: play vars 13273 1726853296.39475: variable 'controller_profile' from source: play vars 13273 1726853296.39481: variable 'port2_profile' from source: play vars 13273 1726853296.39522: variable 'port2_profile' from source: play vars 13273 1726853296.39528: variable 'dhcp_interface2' from source: play vars 13273 1726853296.39576: variable 'dhcp_interface2' from source: play vars 13273 1726853296.39579: variable 'controller_profile' from source: play vars 13273 1726853296.39621: variable 'controller_profile' from source: play vars 13273 1726853296.39624: variable 'ansible_distribution' from source: facts 13273 1726853296.39627: variable '__network_rh_distros' from source: role '' defaults 13273 1726853296.39633: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.39654: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853296.39774: variable 'ansible_distribution' from source: facts 13273 1726853296.39777: variable '__network_rh_distros' from source: role '' defaults 13273 1726853296.39782: variable 'ansible_distribution_major_version' from source: 
facts 13273 1726853296.39796: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853296.39905: variable 'ansible_distribution' from source: facts 13273 1726853296.39910: variable '__network_rh_distros' from source: role '' defaults 13273 1726853296.39912: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.39940: variable 'network_provider' from source: set_fact 13273 1726853296.39954: variable 'ansible_facts' from source: unknown 13273 1726853296.40313: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13273 1726853296.40317: when evaluation is False, skipping this task 13273 1726853296.40319: _execute() done 13273 1726853296.40321: dumping result to json 13273 1726853296.40323: done dumping result, returning 13273 1726853296.40332: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-5fc3-657d-00000000002e] 13273 1726853296.40334: sending task result for task 02083763-bbaf-5fc3-657d-00000000002e 13273 1726853296.40420: done sending task result for task 02083763-bbaf-5fc3-657d-00000000002e 13273 1726853296.40423: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13273 1726853296.40500: no more pending results, returning what we have 13273 1726853296.40503: results queue empty 13273 1726853296.40504: checking for any_errors_fatal 13273 1726853296.40508: done checking for any_errors_fatal 13273 1726853296.40509: checking for max_fail_percentage 13273 1726853296.40510: done checking for max_fail_percentage 13273 1726853296.40511: checking to see if all hosts have failed and the running result is not ok 13273 1726853296.40512: done checking to see if all hosts have failed 13273 1726853296.40512: getting the remaining hosts for 
this loop 13273 1726853296.40514: done getting the remaining hosts for this loop 13273 1726853296.40517: getting the next task for host managed_node3 13273 1726853296.40522: done getting next task for host managed_node3 13273 1726853296.40526: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853296.40529: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853296.40544: getting variables 13273 1726853296.40545: in VariableManager get_vars() 13273 1726853296.40595: Calling all_inventory to load vars for managed_node3 13273 1726853296.40616: Calling groups_inventory to load vars for managed_node3 13273 1726853296.40620: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853296.40629: Calling all_plugins_play to load vars for managed_node3 13273 1726853296.40632: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853296.40634: Calling groups_plugins_play to load vars for managed_node3 13273 1726853296.44748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853296.45787: done with get_vars() 13273 1726853296.45801: done getting variables 13273 1726853296.45837: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:28:16 -0400 (0:00:00.210) 0:00:14.347 ****** 13273 1726853296.45860: entering _queue_task() for managed_node3/package 13273 1726853296.46127: worker is 1 (out of 1 available) 13273 1726853296.46140: exiting _queue_task() for managed_node3/package 13273 1726853296.46154: done queuing things up, now waiting for results queue to drain 13273 1726853296.46155: waiting for pending results... 13273 1726853296.46338: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853296.46425: in run() - task 02083763-bbaf-5fc3-657d-00000000002f 13273 1726853296.46436: variable 'ansible_search_path' from source: unknown 13273 1726853296.46440: variable 'ansible_search_path' from source: unknown 13273 1726853296.46468: calling self._execute() 13273 1726853296.46540: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853296.46547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853296.46556: variable 'omit' from source: magic vars 13273 1726853296.46849: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.46856: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853296.46937: variable 'network_state' from source: role '' defaults 13273 1726853296.46948: Evaluated conditional (network_state != {}): False 13273 1726853296.46951: when evaluation is False, skipping this task 13273 1726853296.46955: _execute() done 13273 
1726853296.46958: dumping result to json 13273 1726853296.46961: done dumping result, returning 13273 1726853296.46966: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-5fc3-657d-00000000002f] 13273 1726853296.46973: sending task result for task 02083763-bbaf-5fc3-657d-00000000002f 13273 1726853296.47080: done sending task result for task 02083763-bbaf-5fc3-657d-00000000002f 13273 1726853296.47082: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853296.47129: no more pending results, returning what we have 13273 1726853296.47132: results queue empty 13273 1726853296.47133: checking for any_errors_fatal 13273 1726853296.47141: done checking for any_errors_fatal 13273 1726853296.47144: checking for max_fail_percentage 13273 1726853296.47145: done checking for max_fail_percentage 13273 1726853296.47146: checking to see if all hosts have failed and the running result is not ok 13273 1726853296.47147: done checking to see if all hosts have failed 13273 1726853296.47148: getting the remaining hosts for this loop 13273 1726853296.47150: done getting the remaining hosts for this loop 13273 1726853296.47153: getting the next task for host managed_node3 13273 1726853296.47159: done getting next task for host managed_node3 13273 1726853296.47163: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853296.47165: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853296.47186: getting variables 13273 1726853296.47187: in VariableManager get_vars() 13273 1726853296.47231: Calling all_inventory to load vars for managed_node3 13273 1726853296.47234: Calling groups_inventory to load vars for managed_node3 13273 1726853296.47236: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853296.47246: Calling all_plugins_play to load vars for managed_node3 13273 1726853296.47248: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853296.47251: Calling groups_plugins_play to load vars for managed_node3 13273 1726853296.48051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853296.49067: done with get_vars() 13273 1726853296.49082: done getting variables 13273 1726853296.49121: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:28:16 -0400 (0:00:00.032) 0:00:14.380 ****** 13273 1726853296.49145: entering _queue_task() for managed_node3/package 13273 1726853296.49360: worker is 1 (out of 1 available) 13273 1726853296.49375: exiting _queue_task() 
for managed_node3/package 13273 1726853296.49387: done queuing things up, now waiting for results queue to drain 13273 1726853296.49388: waiting for pending results... 13273 1726853296.49551: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853296.49634: in run() - task 02083763-bbaf-5fc3-657d-000000000030 13273 1726853296.49648: variable 'ansible_search_path' from source: unknown 13273 1726853296.49651: variable 'ansible_search_path' from source: unknown 13273 1726853296.49689: calling self._execute() 13273 1726853296.49756: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853296.49762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853296.49770: variable 'omit' from source: magic vars 13273 1726853296.50034: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.50047: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853296.50128: variable 'network_state' from source: role '' defaults 13273 1726853296.50135: Evaluated conditional (network_state != {}): False 13273 1726853296.50138: when evaluation is False, skipping this task 13273 1726853296.50140: _execute() done 13273 1726853296.50146: dumping result to json 13273 1726853296.50149: done dumping result, returning 13273 1726853296.50153: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-5fc3-657d-000000000030] 13273 1726853296.50160: sending task result for task 02083763-bbaf-5fc3-657d-000000000030 13273 1726853296.50247: done sending task result for task 02083763-bbaf-5fc3-657d-000000000030 13273 1726853296.50251: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" 
} 13273 1726853296.50319: no more pending results, returning what we have 13273 1726853296.50322: results queue empty 13273 1726853296.50322: checking for any_errors_fatal 13273 1726853296.50327: done checking for any_errors_fatal 13273 1726853296.50327: checking for max_fail_percentage 13273 1726853296.50329: done checking for max_fail_percentage 13273 1726853296.50329: checking to see if all hosts have failed and the running result is not ok 13273 1726853296.50330: done checking to see if all hosts have failed 13273 1726853296.50331: getting the remaining hosts for this loop 13273 1726853296.50332: done getting the remaining hosts for this loop 13273 1726853296.50334: getting the next task for host managed_node3 13273 1726853296.50339: done getting next task for host managed_node3 13273 1726853296.50345: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853296.50348: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853296.50361: getting variables 13273 1726853296.50362: in VariableManager get_vars() 13273 1726853296.50404: Calling all_inventory to load vars for managed_node3 13273 1726853296.50406: Calling groups_inventory to load vars for managed_node3 13273 1726853296.50409: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853296.50420: Calling all_plugins_play to load vars for managed_node3 13273 1726853296.50424: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853296.50427: Calling groups_plugins_play to load vars for managed_node3 13273 1726853296.51384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853296.52321: done with get_vars() 13273 1726853296.52335: done getting variables 13273 1726853296.52413: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:28:16 -0400 (0:00:00.032) 0:00:14.413 ****** 13273 1726853296.52433: entering _queue_task() for managed_node3/service 13273 1726853296.52434: Creating lock for service 13273 1726853296.52657: worker is 1 (out of 1 available) 13273 1726853296.52673: exiting _queue_task() for managed_node3/service 13273 1726853296.52686: done queuing things up, now waiting for results queue to drain 13273 1726853296.52687: waiting for pending results... 
13273 1726853296.52891: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853296.52983: in run() - task 02083763-bbaf-5fc3-657d-000000000031 13273 1726853296.52994: variable 'ansible_search_path' from source: unknown 13273 1726853296.52997: variable 'ansible_search_path' from source: unknown 13273 1726853296.53035: calling self._execute() 13273 1726853296.53124: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853296.53128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853296.53139: variable 'omit' from source: magic vars 13273 1726853296.53489: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.53498: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853296.53595: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853296.53758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853296.55366: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853296.55421: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853296.55450: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853296.55476: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853296.55499: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853296.55554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13273 1726853296.55599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.55603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.55631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.55641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.55677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.55695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.55715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.55741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.55837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.55841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.55844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.55857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.55884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.55895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.56007: variable 'network_connections' from source: task vars 13273 1726853296.56016: variable 'controller_profile' from source: play vars 13273 1726853296.56065: variable 'controller_profile' from source: play vars 13273 1726853296.56074: variable 'controller_device' from source: play vars 13273 1726853296.56280: variable 'controller_device' from source: play vars 13273 1726853296.56284: variable 'port1_profile' from source: play vars 13273 1726853296.56287: variable 'port1_profile' from source: play vars 13273 1726853296.56289: variable 'dhcp_interface1' from source: play vars 13273 1726853296.56295: variable 'dhcp_interface1' from source: play vars 13273 1726853296.56307: variable 'controller_profile' from source: play vars 13273 
1726853296.56375: variable 'controller_profile' from source: play vars 13273 1726853296.56381: variable 'port2_profile' from source: play vars 13273 1726853296.56425: variable 'port2_profile' from source: play vars 13273 1726853296.56431: variable 'dhcp_interface2' from source: play vars 13273 1726853296.56493: variable 'dhcp_interface2' from source: play vars 13273 1726853296.56499: variable 'controller_profile' from source: play vars 13273 1726853296.56649: variable 'controller_profile' from source: play vars 13273 1726853296.56670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853296.56803: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853296.56828: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853296.56853: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853296.56877: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853296.56907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853296.56975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853296.56979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.56995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13273 1726853296.57082: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853296.57286: variable 'network_connections' from source: task vars 13273 1726853296.57302: variable 'controller_profile' from source: play vars 13273 1726853296.57377: variable 'controller_profile' from source: play vars 13273 1726853296.57390: variable 'controller_device' from source: play vars 13273 1726853296.57533: variable 'controller_device' from source: play vars 13273 1726853296.57536: variable 'port1_profile' from source: play vars 13273 1726853296.57538: variable 'port1_profile' from source: play vars 13273 1726853296.57542: variable 'dhcp_interface1' from source: play vars 13273 1726853296.57585: variable 'dhcp_interface1' from source: play vars 13273 1726853296.57596: variable 'controller_profile' from source: play vars 13273 1726853296.57634: variable 'controller_profile' from source: play vars 13273 1726853296.57640: variable 'port2_profile' from source: play vars 13273 1726853296.57686: variable 'port2_profile' from source: play vars 13273 1726853296.57692: variable 'dhcp_interface2' from source: play vars 13273 1726853296.57736: variable 'dhcp_interface2' from source: play vars 13273 1726853296.57741: variable 'controller_profile' from source: play vars 13273 1726853296.57788: variable 'controller_profile' from source: play vars 13273 1726853296.57817: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853296.57821: when evaluation is False, skipping this task 13273 1726853296.57823: _execute() done 13273 1726853296.57826: dumping result to json 13273 1726853296.57828: done dumping result, returning 13273 1726853296.57831: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000031] 13273 1726853296.57833: sending task result for task 
02083763-bbaf-5fc3-657d-000000000031 13273 1726853296.57920: done sending task result for task 02083763-bbaf-5fc3-657d-000000000031 13273 1726853296.57922: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853296.57995: no more pending results, returning what we have 13273 1726853296.57998: results queue empty 13273 1726853296.57999: checking for any_errors_fatal 13273 1726853296.58004: done checking for any_errors_fatal 13273 1726853296.58005: checking for max_fail_percentage 13273 1726853296.58007: done checking for max_fail_percentage 13273 1726853296.58007: checking to see if all hosts have failed and the running result is not ok 13273 1726853296.58008: done checking to see if all hosts have failed 13273 1726853296.58009: getting the remaining hosts for this loop 13273 1726853296.58010: done getting the remaining hosts for this loop 13273 1726853296.58013: getting the next task for host managed_node3 13273 1726853296.58018: done getting next task for host managed_node3 13273 1726853296.58021: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853296.58023: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853296.58038: getting variables 13273 1726853296.58039: in VariableManager get_vars() 13273 1726853296.58085: Calling all_inventory to load vars for managed_node3 13273 1726853296.58089: Calling groups_inventory to load vars for managed_node3 13273 1726853296.58091: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853296.58098: Calling all_plugins_play to load vars for managed_node3 13273 1726853296.58100: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853296.58102: Calling groups_plugins_play to load vars for managed_node3 13273 1726853296.59721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853296.60682: done with get_vars() 13273 1726853296.60697: done getting variables 13273 1726853296.60737: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:28:16 -0400 (0:00:00.083) 0:00:14.496 ****** 13273 1726853296.60759: entering _queue_task() for managed_node3/service 13273 1726853296.60974: worker is 1 (out of 1 available) 13273 1726853296.60987: exiting _queue_task() for managed_node3/service 13273 1726853296.61000: done queuing things up, now waiting for results queue to drain 13273 1726853296.61001: waiting for pending results... 
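[Editor's note] The task queued here corresponds to the banner above (tasks/main.yml:122); the trace that follows shows its `when` guard (`network_provider == "nm" or network_state != {}`) evaluating True before the service module runs. A hypothetical sketch of such a task, not the role's actual source, using the `network_service_name` variable the log resolves from role defaults:

```yaml
# Hypothetical sketch only -- the real task lives in the
# fedora.linux_system_roles.network role and may differ.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"  # NetworkManager under the nm provider
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

A task shaped like this would produce exactly the sequence logged here: conditional evaluation, connection/shell plugin setup, then the `service` action plugin dispatching `ansible.legacy.systemd` over SSH.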
13273 1726853296.61178: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853296.61253: in run() - task 02083763-bbaf-5fc3-657d-000000000032 13273 1726853296.61265: variable 'ansible_search_path' from source: unknown 13273 1726853296.61268: variable 'ansible_search_path' from source: unknown 13273 1726853296.61297: calling self._execute() 13273 1726853296.61372: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853296.61376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853296.61384: variable 'omit' from source: magic vars 13273 1726853296.61656: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.61670: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853296.61777: variable 'network_provider' from source: set_fact 13273 1726853296.61782: variable 'network_state' from source: role '' defaults 13273 1726853296.61785: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13273 1726853296.61801: variable 'omit' from source: magic vars 13273 1726853296.61976: variable 'omit' from source: magic vars 13273 1726853296.61979: variable 'network_service_name' from source: role '' defaults 13273 1726853296.61981: variable 'network_service_name' from source: role '' defaults 13273 1726853296.62138: variable '__network_provider_setup' from source: role '' defaults 13273 1726853296.62153: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853296.62218: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853296.62233: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853296.62300: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853296.62502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13273 1726853296.64532: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853296.64613: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853296.64657: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853296.64698: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853296.64726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853296.64807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.64845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.64880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.64926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.64949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.65076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853296.65080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.65082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.65100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.65121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.65348: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853296.65464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.65495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.65523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.65567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.65588: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.65681: variable 'ansible_python' from source: facts 13273 1726853296.65707: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853296.65790: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853296.65870: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853296.66075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.66079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.66085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.66099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.66118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.66169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853296.66210: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853296.66240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.66290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853296.66310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853296.66444: variable 'network_connections' from source: task vars 13273 1726853296.66456: variable 'controller_profile' from source: play vars 13273 1726853296.66527: variable 'controller_profile' from source: play vars 13273 1726853296.66678: variable 'controller_device' from source: play vars 13273 1726853296.66681: variable 'controller_device' from source: play vars 13273 1726853296.66684: variable 'port1_profile' from source: play vars 13273 1726853296.66710: variable 'port1_profile' from source: play vars 13273 1726853296.66725: variable 'dhcp_interface1' from source: play vars 13273 1726853296.66805: variable 'dhcp_interface1' from source: play vars 13273 1726853296.66821: variable 'controller_profile' from source: play vars 13273 1726853296.66899: variable 'controller_profile' from source: play vars 13273 1726853296.66914: variable 'port2_profile' from source: play vars 13273 1726853296.66991: variable 'port2_profile' from source: play vars 13273 1726853296.67007: variable 'dhcp_interface2' from source: play vars 13273 1726853296.67086: variable 'dhcp_interface2' from source: play vars 13273 
1726853296.67102: variable 'controller_profile' from source: play vars 13273 1726853296.67179: variable 'controller_profile' from source: play vars 13273 1726853296.67280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853296.67480: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853296.67534: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853296.67591: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853296.67637: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853296.67706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853296.67740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853296.67782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853296.67822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853296.67883: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853296.68167: variable 'network_connections' from source: task vars 13273 1726853296.68182: variable 'controller_profile' from source: play vars 13273 1726853296.68258: variable 'controller_profile' from source: play vars 13273 
1726853296.68377: variable 'controller_device' from source: play vars 13273 1726853296.68381: variable 'controller_device' from source: play vars 13273 1726853296.68383: variable 'port1_profile' from source: play vars 13273 1726853296.68438: variable 'port1_profile' from source: play vars 13273 1726853296.68458: variable 'dhcp_interface1' from source: play vars 13273 1726853296.68532: variable 'dhcp_interface1' from source: play vars 13273 1726853296.68551: variable 'controller_profile' from source: play vars 13273 1726853296.68625: variable 'controller_profile' from source: play vars 13273 1726853296.68644: variable 'port2_profile' from source: play vars 13273 1726853296.68718: variable 'port2_profile' from source: play vars 13273 1726853296.68734: variable 'dhcp_interface2' from source: play vars 13273 1726853296.68810: variable 'dhcp_interface2' from source: play vars 13273 1726853296.68824: variable 'controller_profile' from source: play vars 13273 1726853296.68895: variable 'controller_profile' from source: play vars 13273 1726853296.68950: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853296.69033: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853296.69324: variable 'network_connections' from source: task vars 13273 1726853296.69336: variable 'controller_profile' from source: play vars 13273 1726853296.69411: variable 'controller_profile' from source: play vars 13273 1726853296.69477: variable 'controller_device' from source: play vars 13273 1726853296.69499: variable 'controller_device' from source: play vars 13273 1726853296.69512: variable 'port1_profile' from source: play vars 13273 1726853296.69588: variable 'port1_profile' from source: play vars 13273 1726853296.69601: variable 'dhcp_interface1' from source: play vars 13273 1726853296.69674: variable 'dhcp_interface1' from source: play vars 13273 1726853296.69687: variable 'controller_profile' from source: play vars 
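[Editor's note] The repeated lookups above resolve a bond-style profile set: one controller profile/device plus two port profiles bound to DHCP test interfaces. The actual values are not printed at this verbosity; the following is a purely illustrative `network_connections` shape consistent with those variable names (all values are assumptions, not taken from the log):

```yaml
# Illustrative only -- none of these values appear in the log.
controller_profile: bond0
controller_device: nm-bond
port1_profile: bond0.0
dhcp_interface1: test1
port2_profile: bond0.1
dhcp_interface2: test2

network_connections:
  - name: "{{ controller_profile }}"
    type: bond
    interface_name: "{{ controller_device }}"
  - name: "{{ port1_profile }}"
    type: ethernet
    interface_name: "{{ dhcp_interface1 }}"
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    type: ethernet
    interface_name: "{{ dhcp_interface2 }}"
    controller: "{{ controller_profile }}"
```

Each templated reference in such a structure triggers one "variable ... from source: play vars" line per lookup, which is why the same names repeat so often in the trace.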
13273 1726853296.69762: variable 'controller_profile' from source: play vars 13273 1726853296.69765: variable 'port2_profile' from source: play vars 13273 1726853296.69830: variable 'port2_profile' from source: play vars 13273 1726853296.69841: variable 'dhcp_interface2' from source: play vars 13273 1726853296.69922: variable 'dhcp_interface2' from source: play vars 13273 1726853296.69936: variable 'controller_profile' from source: play vars 13273 1726853296.70011: variable 'controller_profile' from source: play vars 13273 1726853296.70045: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853296.70128: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853296.70449: variable 'network_connections' from source: task vars 13273 1726853296.70459: variable 'controller_profile' from source: play vars 13273 1726853296.70577: variable 'controller_profile' from source: play vars 13273 1726853296.70581: variable 'controller_device' from source: play vars 13273 1726853296.70620: variable 'controller_device' from source: play vars 13273 1726853296.70634: variable 'port1_profile' from source: play vars 13273 1726853296.70710: variable 'port1_profile' from source: play vars 13273 1726853296.70722: variable 'dhcp_interface1' from source: play vars 13273 1726853296.70797: variable 'dhcp_interface1' from source: play vars 13273 1726853296.70809: variable 'controller_profile' from source: play vars 13273 1726853296.70881: variable 'controller_profile' from source: play vars 13273 1726853296.70979: variable 'port2_profile' from source: play vars 13273 1726853296.70982: variable 'port2_profile' from source: play vars 13273 1726853296.70985: variable 'dhcp_interface2' from source: play vars 13273 1726853296.71050: variable 'dhcp_interface2' from source: play vars 13273 1726853296.71061: variable 'controller_profile' from source: play vars 13273 1726853296.71133: variable 'controller_profile' from source: play vars 
13273 1726853296.71206: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853296.71277: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853296.71290: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853296.71351: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853296.71565: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853296.72056: variable 'network_connections' from source: task vars 13273 1726853296.72066: variable 'controller_profile' from source: play vars 13273 1726853296.72129: variable 'controller_profile' from source: play vars 13273 1726853296.72139: variable 'controller_device' from source: play vars 13273 1726853296.72199: variable 'controller_device' from source: play vars 13273 1726853296.72377: variable 'port1_profile' from source: play vars 13273 1726853296.72380: variable 'port1_profile' from source: play vars 13273 1726853296.72382: variable 'dhcp_interface1' from source: play vars 13273 1726853296.72383: variable 'dhcp_interface1' from source: play vars 13273 1726853296.72385: variable 'controller_profile' from source: play vars 13273 1726853296.72401: variable 'controller_profile' from source: play vars 13273 1726853296.72412: variable 'port2_profile' from source: play vars 13273 1726853296.72476: variable 'port2_profile' from source: play vars 13273 1726853296.72487: variable 'dhcp_interface2' from source: play vars 13273 1726853296.72549: variable 'dhcp_interface2' from source: play vars 13273 1726853296.72559: variable 'controller_profile' from source: play vars 13273 1726853296.72619: variable 'controller_profile' from source: play vars 13273 1726853296.72633: variable 'ansible_distribution' from source: facts 13273 1726853296.72639: variable '__network_rh_distros' from source: role '' defaults 13273 1726853296.72650: 
variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.72734: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853296.72845: variable 'ansible_distribution' from source: facts 13273 1726853296.72854: variable '__network_rh_distros' from source: role '' defaults 13273 1726853296.72863: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.72881: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853296.73051: variable 'ansible_distribution' from source: facts 13273 1726853296.73065: variable '__network_rh_distros' from source: role '' defaults 13273 1726853296.73079: variable 'ansible_distribution_major_version' from source: facts 13273 1726853296.73119: variable 'network_provider' from source: set_fact 13273 1726853296.73150: variable 'omit' from source: magic vars 13273 1726853296.73189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853296.73277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853296.73280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853296.73283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853296.73285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853296.73311: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853296.73319: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853296.73327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853296.73433: Set connection var ansible_connection to ssh 13273 
1726853296.73453: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853296.73464: Set connection var ansible_shell_executable to /bin/sh 13273 1726853296.73473: Set connection var ansible_shell_type to sh 13273 1726853296.73484: Set connection var ansible_pipelining to False 13273 1726853296.73499: Set connection var ansible_timeout to 10 13273 1726853296.73576: variable 'ansible_shell_executable' from source: unknown 13273 1726853296.73579: variable 'ansible_connection' from source: unknown 13273 1726853296.73581: variable 'ansible_module_compression' from source: unknown 13273 1726853296.73583: variable 'ansible_shell_type' from source: unknown 13273 1726853296.73585: variable 'ansible_shell_executable' from source: unknown 13273 1726853296.73587: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853296.73589: variable 'ansible_pipelining' from source: unknown 13273 1726853296.73591: variable 'ansible_timeout' from source: unknown 13273 1726853296.73593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853296.73688: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853296.73705: variable 'omit' from source: magic vars 13273 1726853296.73720: starting attempt loop 13273 1726853296.73727: running the handler 13273 1726853296.73804: variable 'ansible_facts' from source: unknown 13273 1726853296.74577: _low_level_execute_command(): starting 13273 1726853296.74580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853296.75378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853296.75454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853296.75478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853296.75500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853296.75603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853296.77749: stdout chunk (state=3): >>>/root <<< 13273 1726853296.77753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853296.77755: stdout chunk (state=3): >>><<< 13273 1726853296.77757: stderr chunk (state=3): >>><<< 13273 1726853296.77759: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853296.77766: _low_level_execute_command(): starting 13273 1726853296.77769: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021 `" && echo ansible-tmp-1726853296.776632-14046-117551899780021="` echo /root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021 `" ) && sleep 0' 13273 1726853296.78595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853296.78608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853296.78627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853296.78650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853296.78668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853296.78681: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853296.78769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853296.78791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853296.78888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853296.81239: stdout chunk (state=3): >>>ansible-tmp-1726853296.776632-14046-117551899780021=/root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021 <<< 13273 1726853296.81453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853296.81457: stdout chunk (state=3): >>><<< 13273 1726853296.81459: stderr chunk (state=3): >>><<< 13273 1726853296.81462: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853296.776632-14046-117551899780021=/root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853296.81464: variable 'ansible_module_compression' from source: unknown 13273 1726853296.81587: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 13273 1726853296.81591: ANSIBALLZ: Acquiring lock 13273 1726853296.81593: ANSIBALLZ: Lock acquired: 140136094830320 13273 1726853296.81596: ANSIBALLZ: Creating module 13273 1726853297.23141: ANSIBALLZ: Writing module into payload 13273 1726853297.23332: ANSIBALLZ: Writing module 13273 1726853297.23373: ANSIBALLZ: Renaming module 13273 1726853297.23398: ANSIBALLZ: Done creating module 13273 1726853297.23442: variable 'ansible_facts' from source: unknown 13273 1726853297.23696: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/AnsiballZ_systemd.py 13273 1726853297.23963: Sending initial data 13273 1726853297.23966: Sent initial data (155 bytes) 13273 1726853297.24983: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853297.24987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853297.24989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853297.26606: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853297.26611: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853297.26665: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853297.26718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpi8jdumc0 /root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/AnsiballZ_systemd.py <<< 13273 1726853297.26721: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/AnsiballZ_systemd.py" <<< 13273 1726853297.26778: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpi8jdumc0" to remote "/root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/AnsiballZ_systemd.py" <<< 13273 1726853297.28195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853297.28198: stdout chunk (state=3): >>><<< 13273 1726853297.28200: stderr chunk (state=3): >>><<< 13273 1726853297.28222: done transferring module to remote 13273 1726853297.28238: _low_level_execute_command(): starting 13273 1726853297.28266: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/ /root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/AnsiballZ_systemd.py && sleep 0' 13273 1726853297.28684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853297.28709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853297.28713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853297.28715: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853297.28718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853297.28720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853297.28731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853297.28781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853297.28802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853297.28804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853297.28856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853297.30728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853297.30755: stderr chunk (state=3): >>><<< 13273 1726853297.30759: stdout chunk (state=3): >>><<< 13273 1726853297.30774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853297.30777: _low_level_execute_command(): starting 13273 1726853297.30782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/AnsiballZ_systemd.py && sleep 0' 13273 1726853297.31217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853297.31220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853297.31222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853297.31224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853297.31226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853297.31285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853297.31289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853297.31291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853297.31397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853297.61183: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10366976", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3327475712", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "889129000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": 
"0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 13273 1726853297.61223: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": 
"inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13273 1726853297.63690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853297.63693: stdout chunk (state=3): >>><<< 13273 1726853297.63696: stderr chunk (state=3): >>><<< 13273 1726853297.63700: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10366976", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3327475712", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "889129000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853297.63989: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853297.64023: _low_level_execute_command(): starting 13273 1726853297.64033: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853296.776632-14046-117551899780021/ > /dev/null 2>&1 && sleep 0' 13273 1726853297.65139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853297.65186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853297.65233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853297.65251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853297.65264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853297.65348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853297.67506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853297.67512: stdout chunk (state=3): >>><<< 13273 1726853297.67515: stderr chunk (state=3): >>><<< 13273 1726853297.67517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 
1726853297.67519: handler run complete 13273 1726853297.67781: attempt loop complete, returning result 13273 1726853297.67785: _execute() done 13273 1726853297.67787: dumping result to json 13273 1726853297.67789: done dumping result, returning 13273 1726853297.67792: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-5fc3-657d-000000000032] 13273 1726853297.67794: sending task result for task 02083763-bbaf-5fc3-657d-000000000032 13273 1726853297.68385: done sending task result for task 02083763-bbaf-5fc3-657d-000000000032 13273 1726853297.68389: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853297.68432: no more pending results, returning what we have 13273 1726853297.68435: results queue empty 13273 1726853297.68435: checking for any_errors_fatal 13273 1726853297.68440: done checking for any_errors_fatal 13273 1726853297.68441: checking for max_fail_percentage 13273 1726853297.68445: done checking for max_fail_percentage 13273 1726853297.68446: checking to see if all hosts have failed and the running result is not ok 13273 1726853297.68446: done checking to see if all hosts have failed 13273 1726853297.68447: getting the remaining hosts for this loop 13273 1726853297.68448: done getting the remaining hosts for this loop 13273 1726853297.68451: getting the next task for host managed_node3 13273 1726853297.68456: done getting next task for host managed_node3 13273 1726853297.68459: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853297.68463: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853297.68474: getting variables 13273 1726853297.68475: in VariableManager get_vars() 13273 1726853297.68518: Calling all_inventory to load vars for managed_node3 13273 1726853297.68522: Calling groups_inventory to load vars for managed_node3 13273 1726853297.68524: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853297.68532: Calling all_plugins_play to load vars for managed_node3 13273 1726853297.68534: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853297.68537: Calling groups_plugins_play to load vars for managed_node3 13273 1726853297.70006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853297.72340: done with get_vars() 13273 1726853297.72430: done getting variables 13273 1726853297.72501: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:28:17 -0400 (0:00:01.117) 0:00:15.616 ****** 13273 1726853297.72698: entering _queue_task() for managed_node3/service 13273 1726853297.73440: worker is 1 (out of 1 available) 13273 1726853297.73487: exiting _queue_task() for managed_node3/service 
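The wpa_supplicant task queued above is guarded by several conditionals, whose evaluations appear in the records that follow (`ansible_distribution_major_version != '6'`: True, `network_provider == "nm"`: True, `__network_wpa_supplicant_required`: False). A hedged sketch of what such a guarded task looks like — the `when` expressions are taken from the log's "Evaluated conditional" lines, while the task body is an assumption:

```yaml
# Guard conditions reconstructed from the log; the service body is assumed.
# The task is skipped here because __network_wpa_supplicant_required is false
# (no IEEE 802.1X or wireless connections are defined in the play).
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required
```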
13273 1726853297.73499: done queuing things up, now waiting for results queue to drain 13273 1726853297.73528: waiting for pending results... 13273 1726853297.73688: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853297.73804: in run() - task 02083763-bbaf-5fc3-657d-000000000033 13273 1726853297.73817: variable 'ansible_search_path' from source: unknown 13273 1726853297.73829: variable 'ansible_search_path' from source: unknown 13273 1726853297.73867: calling self._execute() 13273 1726853297.73951: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853297.73956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853297.73968: variable 'omit' from source: magic vars 13273 1726853297.74270: variable 'ansible_distribution_major_version' from source: facts 13273 1726853297.74282: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853297.74366: variable 'network_provider' from source: set_fact 13273 1726853297.74370: Evaluated conditional (network_provider == "nm"): True 13273 1726853297.74436: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853297.74501: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853297.74622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853297.76810: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853297.76825: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853297.76865: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853297.77050: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853297.77084: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853297.77176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853297.77244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853297.77249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853297.77301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853297.77320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853297.77378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853297.77462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853297.77465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 13273 1726853297.77485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853297.77504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853297.77544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853297.77584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853297.77612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853297.77654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853297.77685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853297.77846: variable 'network_connections' from source: task vars 13273 1726853297.77855: variable 'controller_profile' from source: play vars 13273 1726853297.77932: variable 'controller_profile' from source: play vars 13273 1726853297.77940: variable 'controller_device' from source: play vars 13273 1726853297.77986: variable 'controller_device' from source: play vars 13273 1726853297.77994: 
variable 'port1_profile' from source: play vars 13273 1726853297.78041: variable 'port1_profile' from source: play vars 13273 1726853297.78049: variable 'dhcp_interface1' from source: play vars 13273 1726853297.78096: variable 'dhcp_interface1' from source: play vars 13273 1726853297.78102: variable 'controller_profile' from source: play vars 13273 1726853297.78149: variable 'controller_profile' from source: play vars 13273 1726853297.78155: variable 'port2_profile' from source: play vars 13273 1726853297.78197: variable 'port2_profile' from source: play vars 13273 1726853297.78203: variable 'dhcp_interface2' from source: play vars 13273 1726853297.78250: variable 'dhcp_interface2' from source: play vars 13273 1726853297.78255: variable 'controller_profile' from source: play vars 13273 1726853297.78310: variable 'controller_profile' from source: play vars 13273 1726853297.78363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853297.78475: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853297.78501: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853297.78523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853297.78545: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853297.78579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853297.78595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853297.78612: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853297.78635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853297.78678: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853297.78826: variable 'network_connections' from source: task vars 13273 1726853297.78830: variable 'controller_profile' from source: play vars 13273 1726853297.78880: variable 'controller_profile' from source: play vars 13273 1726853297.78883: variable 'controller_device' from source: play vars 13273 1726853297.78925: variable 'controller_device' from source: play vars 13273 1726853297.78932: variable 'port1_profile' from source: play vars 13273 1726853297.78975: variable 'port1_profile' from source: play vars 13273 1726853297.78983: variable 'dhcp_interface1' from source: play vars 13273 1726853297.79026: variable 'dhcp_interface1' from source: play vars 13273 1726853297.79031: variable 'controller_profile' from source: play vars 13273 1726853297.79075: variable 'controller_profile' from source: play vars 13273 1726853297.79081: variable 'port2_profile' from source: play vars 13273 1726853297.79125: variable 'port2_profile' from source: play vars 13273 1726853297.79131: variable 'dhcp_interface2' from source: play vars 13273 1726853297.79176: variable 'dhcp_interface2' from source: play vars 13273 1726853297.79181: variable 'controller_profile' from source: play vars 13273 1726853297.79224: variable 'controller_profile' from source: play vars 13273 1726853297.79256: Evaluated conditional (__network_wpa_supplicant_required): False 13273 1726853297.79260: when evaluation is False, skipping this task 13273 1726853297.79263: _execute() done 13273 1726853297.79265: 
dumping result to json 13273 1726853297.79267: done dumping result, returning 13273 1726853297.79274: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-5fc3-657d-000000000033] 13273 1726853297.79280: sending task result for task 02083763-bbaf-5fc3-657d-000000000033 13273 1726853297.79361: done sending task result for task 02083763-bbaf-5fc3-657d-000000000033 13273 1726853297.79363: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13273 1726853297.79411: no more pending results, returning what we have 13273 1726853297.79414: results queue empty 13273 1726853297.79415: checking for any_errors_fatal 13273 1726853297.79431: done checking for any_errors_fatal 13273 1726853297.79432: checking for max_fail_percentage 13273 1726853297.79433: done checking for max_fail_percentage 13273 1726853297.79434: checking to see if all hosts have failed and the running result is not ok 13273 1726853297.79435: done checking to see if all hosts have failed 13273 1726853297.79436: getting the remaining hosts for this loop 13273 1726853297.79437: done getting the remaining hosts for this loop 13273 1726853297.79440: getting the next task for host managed_node3 13273 1726853297.79445: done getting next task for host managed_node3 13273 1726853297.79449: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853297.79452: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853297.79464: getting variables 13273 1726853297.79466: in VariableManager get_vars() 13273 1726853297.79519: Calling all_inventory to load vars for managed_node3 13273 1726853297.79522: Calling groups_inventory to load vars for managed_node3 13273 1726853297.79524: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853297.79533: Calling all_plugins_play to load vars for managed_node3 13273 1726853297.79535: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853297.79538: Calling groups_plugins_play to load vars for managed_node3 13273 1726853297.80889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853297.81787: done with get_vars() 13273 1726853297.81808: done getting variables 13273 1726853297.81860: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:28:17 -0400 (0:00:00.091) 0:00:15.708 ****** 13273 1726853297.81885: entering _queue_task() for managed_node3/service 13273 1726853297.82133: worker is 1 (out of 1 available) 13273 1726853297.82147: exiting _queue_task() for managed_node3/service 13273 1726853297.82159: done queuing things up, now waiting for results queue to drain 13273 1726853297.82160: waiting for pending results... 
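The "Enable network service" task entered above is the initscripts counterpart of the NetworkManager task: it is skipped whenever the provider is not `initscripts`, as the `Evaluated conditional (network_provider == "initscripts"): False` record below shows. A hedged sketch of that guard — the service name is an assumption, only the conditional is confirmed by the log:

```yaml
# Sketch: skipped on this run because network_provider is "nm", not
# "initscripts". The 'network' service name is assumed, not shown in the log.
- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"
```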
13273 1726853297.82342: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853297.82433: in run() - task 02083763-bbaf-5fc3-657d-000000000034 13273 1726853297.82443: variable 'ansible_search_path' from source: unknown 13273 1726853297.82449: variable 'ansible_search_path' from source: unknown 13273 1726853297.82481: calling self._execute() 13273 1726853297.82553: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853297.82557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853297.82566: variable 'omit' from source: magic vars 13273 1726853297.82841: variable 'ansible_distribution_major_version' from source: facts 13273 1726853297.83076: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853297.83080: variable 'network_provider' from source: set_fact 13273 1726853297.83082: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853297.83083: when evaluation is False, skipping this task 13273 1726853297.83085: _execute() done 13273 1726853297.83087: dumping result to json 13273 1726853297.83088: done dumping result, returning 13273 1726853297.83090: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-5fc3-657d-000000000034] 13273 1726853297.83092: sending task result for task 02083763-bbaf-5fc3-657d-000000000034 13273 1726853297.83151: done sending task result for task 02083763-bbaf-5fc3-657d-000000000034 13273 1726853297.83153: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853297.83198: no more pending results, returning what we have 13273 1726853297.83201: results queue empty 13273 1726853297.83202: checking for any_errors_fatal 13273 1726853297.83208: done checking for 
any_errors_fatal 13273 1726853297.83209: checking for max_fail_percentage 13273 1726853297.83210: done checking for max_fail_percentage 13273 1726853297.83211: checking to see if all hosts have failed and the running result is not ok 13273 1726853297.83211: done checking to see if all hosts have failed 13273 1726853297.83212: getting the remaining hosts for this loop 13273 1726853297.83213: done getting the remaining hosts for this loop 13273 1726853297.83216: getting the next task for host managed_node3 13273 1726853297.83220: done getting next task for host managed_node3 13273 1726853297.83224: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853297.83226: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853297.83239: getting variables 13273 1726853297.83240: in VariableManager get_vars() 13273 1726853297.83285: Calling all_inventory to load vars for managed_node3 13273 1726853297.83288: Calling groups_inventory to load vars for managed_node3 13273 1726853297.83290: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853297.83297: Calling all_plugins_play to load vars for managed_node3 13273 1726853297.83299: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853297.83302: Calling groups_plugins_play to load vars for managed_node3 13273 1726853297.84488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853297.85710: done with get_vars() 13273 1726853297.85725: done getting variables 13273 1726853297.85772: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:28:17 -0400 (0:00:00.039) 0:00:15.747 ****** 13273 1726853297.85795: entering _queue_task() for managed_node3/copy 13273 1726853297.86030: worker is 1 (out of 1 available) 13273 1726853297.86044: exiting _queue_task() for managed_node3/copy 13273 1726853297.86057: done queuing things up, now waiting for results queue to drain 13273 1726853297.86058: waiting for pending results... 
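The next task uses the `copy` action (loaded above) and carries the same initscripts guard, so it is also skipped on this NetworkManager-provider run. A hedged sketch under those assumptions — the destination path and content are illustrative placeholders, not taken from the log:

```yaml
# Sketch only: dest and content are hypothetical; the log confirms just the
# 'copy' action and the network_provider == "initscripts" guard.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    content: ""                    # placeholder; real content not shown in log
    dest: /etc/sysconfig/network   # assumed path, not confirmed by the log
  when: network_provider == "initscripts"
```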
13273 1726853297.86237: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853297.86332: in run() - task 02083763-bbaf-5fc3-657d-000000000035 13273 1726853297.86343: variable 'ansible_search_path' from source: unknown 13273 1726853297.86348: variable 'ansible_search_path' from source: unknown 13273 1726853297.86379: calling self._execute() 13273 1726853297.86452: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853297.86456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853297.86465: variable 'omit' from source: magic vars 13273 1726853297.86740: variable 'ansible_distribution_major_version' from source: facts 13273 1726853297.86752: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853297.86829: variable 'network_provider' from source: set_fact 13273 1726853297.86834: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853297.86837: when evaluation is False, skipping this task 13273 1726853297.86840: _execute() done 13273 1726853297.86843: dumping result to json 13273 1726853297.86845: done dumping result, returning 13273 1726853297.86856: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-5fc3-657d-000000000035] 13273 1726853297.86859: sending task result for task 02083763-bbaf-5fc3-657d-000000000035 13273 1726853297.86946: done sending task result for task 02083763-bbaf-5fc3-657d-000000000035 13273 1726853297.86949: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853297.86997: no more pending results, returning what we have 13273 1726853297.87001: results queue empty 13273 1726853297.87001: checking for 
any_errors_fatal 13273 1726853297.87007: done checking for any_errors_fatal 13273 1726853297.87008: checking for max_fail_percentage 13273 1726853297.87009: done checking for max_fail_percentage 13273 1726853297.87010: checking to see if all hosts have failed and the running result is not ok 13273 1726853297.87011: done checking to see if all hosts have failed 13273 1726853297.87012: getting the remaining hosts for this loop 13273 1726853297.87013: done getting the remaining hosts for this loop 13273 1726853297.87016: getting the next task for host managed_node3 13273 1726853297.87023: done getting next task for host managed_node3 13273 1726853297.87026: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853297.87028: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853297.87042: getting variables 13273 1726853297.87044: in VariableManager get_vars() 13273 1726853297.87095: Calling all_inventory to load vars for managed_node3 13273 1726853297.87098: Calling groups_inventory to load vars for managed_node3 13273 1726853297.87100: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853297.87108: Calling all_plugins_play to load vars for managed_node3 13273 1726853297.87110: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853297.87112: Calling groups_plugins_play to load vars for managed_node3 13273 1726853297.87853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853297.88703: done with get_vars() 13273 1726853297.88719: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:28:17 -0400 (0:00:00.029) 0:00:15.776 ****** 13273 1726853297.88781: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853297.88783: Creating lock for fedora.linux_system_roles.network_connections 13273 1726853297.89030: worker is 1 (out of 1 available) 13273 1726853297.89043: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853297.89056: done queuing things up, now waiting for results queue to drain 13273 1726853297.89057: waiting for pending results... 
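The variable lookups that follow (`controller_profile`, `controller_device`, `port1_profile`/`dhcp_interface1`, `port2_profile`/`dhcp_interface2`, repeated as `network_connections` is templated) suggest a controller profile with two attached port profiles. A hedged sketch of how such play vars typically feed the role's `network_connections` input — the `bond`/`ethernet` types and the overall shape are assumptions; the log shows only the variable names:

```yaml
# Illustrative shape only: a controller connection plus two ports referencing
# it. Profile names and device names come from the play vars seen in the log;
# the connection types are assumed.
network_connections:
  - name: "{{ controller_profile }}"
    interface_name: "{{ controller_device }}"
    type: bond
    state: up
  - name: "{{ port1_profile }}"
    interface_name: "{{ dhcp_interface1 }}"
    type: ethernet
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    interface_name: "{{ dhcp_interface2 }}"
    type: ethernet
    controller: "{{ controller_profile }}"
```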
13273 1726853297.89239: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853297.89325: in run() - task 02083763-bbaf-5fc3-657d-000000000036 13273 1726853297.89337: variable 'ansible_search_path' from source: unknown 13273 1726853297.89340: variable 'ansible_search_path' from source: unknown 13273 1726853297.89373: calling self._execute() 13273 1726853297.89449: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853297.89455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853297.89464: variable 'omit' from source: magic vars 13273 1726853297.89740: variable 'ansible_distribution_major_version' from source: facts 13273 1726853297.89752: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853297.89758: variable 'omit' from source: magic vars 13273 1726853297.89803: variable 'omit' from source: magic vars 13273 1726853297.89918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853297.91358: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853297.91401: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853297.91427: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853297.91454: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853297.91477: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853297.91529: variable 'network_provider' from source: set_fact 13273 1726853297.91621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853297.91652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853297.91669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853297.91700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853297.91711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853297.91763: variable 'omit' from source: magic vars 13273 1726853297.91841: variable 'omit' from source: magic vars 13273 1726853297.91913: variable 'network_connections' from source: task vars 13273 1726853297.91923: variable 'controller_profile' from source: play vars 13273 1726853297.91966: variable 'controller_profile' from source: play vars 13273 1726853297.91974: variable 'controller_device' from source: play vars 13273 1726853297.92018: variable 'controller_device' from source: play vars 13273 1726853297.92025: variable 'port1_profile' from source: play vars 13273 1726853297.92068: variable 'port1_profile' from source: play vars 13273 1726853297.92076: variable 'dhcp_interface1' from source: play vars 13273 1726853297.92130: variable 'dhcp_interface1' from source: play vars 13273 1726853297.92135: variable 'controller_profile' from source: play vars 13273 1726853297.92179: variable 'controller_profile' from source: play vars 13273 1726853297.92185: 
variable 'port2_profile' from source: play vars 13273 1726853297.92229: variable 'port2_profile' from source: play vars 13273 1726853297.92235: variable 'dhcp_interface2' from source: play vars 13273 1726853297.92283: variable 'dhcp_interface2' from source: play vars 13273 1726853297.92288: variable 'controller_profile' from source: play vars 13273 1726853297.92334: variable 'controller_profile' from source: play vars 13273 1726853297.92455: variable 'omit' from source: magic vars 13273 1726853297.92462: variable '__lsr_ansible_managed' from source: task vars 13273 1726853297.92505: variable '__lsr_ansible_managed' from source: task vars 13273 1726853297.92622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13273 1726853297.92989: Loaded config def from plugin (lookup/template) 13273 1726853297.92992: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13273 1726853297.93012: File lookup term: get_ansible_managed.j2 13273 1726853297.93015: variable 'ansible_search_path' from source: unknown 13273 1726853297.93018: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13273 1726853297.93030: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13273 1726853297.93043: variable 'ansible_search_path' from source: unknown 13273 1726853297.96161: variable 'ansible_managed' from source: unknown 13273 1726853297.96235: variable 'omit' from source: magic vars 13273 1726853297.96260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853297.96281: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853297.96294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853297.96307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853297.96315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853297.96339: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853297.96342: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853297.96344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853297.96405: Set connection var ansible_connection to ssh 13273 1726853297.96413: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853297.96418: Set connection var ansible_shell_executable to /bin/sh 13273 1726853297.96420: Set connection var ansible_shell_type to sh 13273 1726853297.96431: Set connection var ansible_pipelining to False 13273 1726853297.96434: Set connection var ansible_timeout to 10 13273 1726853297.96454: 
variable 'ansible_shell_executable' from source: unknown 13273 1726853297.96457: variable 'ansible_connection' from source: unknown 13273 1726853297.96459: variable 'ansible_module_compression' from source: unknown 13273 1726853297.96462: variable 'ansible_shell_type' from source: unknown 13273 1726853297.96464: variable 'ansible_shell_executable' from source: unknown 13273 1726853297.96466: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853297.96469: variable 'ansible_pipelining' from source: unknown 13273 1726853297.96473: variable 'ansible_timeout' from source: unknown 13273 1726853297.96478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853297.96599: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853297.96607: variable 'omit' from source: magic vars 13273 1726853297.96614: starting attempt loop 13273 1726853297.96616: running the handler 13273 1726853297.96628: _low_level_execute_command(): starting 13273 1726853297.96635: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853297.97145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853297.97149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853297.97152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853297.97154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853297.97198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853297.97202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853297.97218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853297.97298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853297.99022: stdout chunk (state=3): >>>/root <<< 13273 1726853297.99177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853297.99180: stdout chunk (state=3): >>><<< 13273 1726853297.99182: stderr chunk (state=3): >>><<< 13273 1726853297.99297: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853297.99302: _low_level_execute_command(): starting 13273 1726853297.99305: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387 `" && echo ansible-tmp-1726853297.9921174-14115-133651064055387="` echo /root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387 `" ) && sleep 0' 13273 1726853297.99818: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853297.99863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 
1726853297.99888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853297.99952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853298.02178: stdout chunk (state=3): >>>ansible-tmp-1726853297.9921174-14115-133651064055387=/root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387 <<< 13273 1726853298.02207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853298.02211: stdout chunk (state=3): >>><<< 13273 1726853298.02218: stderr chunk (state=3): >>><<< 13273 1726853298.02237: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853297.9921174-14115-133651064055387=/root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 
1726853298.02319: variable 'ansible_module_compression' from source: unknown 13273 1726853298.02367: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 13273 1726853298.02372: ANSIBALLZ: Acquiring lock 13273 1726853298.02375: ANSIBALLZ: Lock acquired: 140136090703760 13273 1726853298.02484: ANSIBALLZ: Creating module 13273 1726853298.42311: ANSIBALLZ: Writing module into payload 13273 1726853298.42658: ANSIBALLZ: Writing module 13273 1726853298.42691: ANSIBALLZ: Renaming module 13273 1726853298.42704: ANSIBALLZ: Done creating module 13273 1726853298.42736: variable 'ansible_facts' from source: unknown 13273 1726853298.42883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/AnsiballZ_network_connections.py 13273 1726853298.43094: Sending initial data 13273 1726853298.43097: Sent initial data (168 bytes) 13273 1726853298.43725: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853298.43739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853298.43759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853298.43841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853298.43881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853298.43895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853298.43919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853298.44018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853298.45697: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853298.45772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853298.45932: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpd2rn6mru /root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/AnsiballZ_network_connections.py <<< 13273 1726853298.45935: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/AnsiballZ_network_connections.py" <<< 13273 1726853298.46010: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpd2rn6mru" to remote "/root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/AnsiballZ_network_connections.py" <<< 13273 1726853298.47226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853298.47363: stderr chunk (state=3): >>><<< 13273 1726853298.47367: stdout chunk (state=3): >>><<< 13273 1726853298.47369: done transferring module to remote 13273 1726853298.47373: _low_level_execute_command(): starting 13273 1726853298.47376: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/ /root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/AnsiballZ_network_connections.py && sleep 0' 13273 1726853298.47990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853298.48004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853298.48026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853298.48044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853298.48059: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853298.48134: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853298.48169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853298.48186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853298.48205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853298.48287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853298.50189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853298.50232: stderr chunk (state=3): >>><<< 13273 1726853298.50235: stdout chunk (state=3): >>><<< 13273 1726853298.50253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853298.50256: _low_level_execute_command(): starting 13273 1726853298.50261: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/AnsiballZ_network_connections.py && sleep 0' 13273 1726853298.50844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853298.50851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853298.50862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853298.50958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853298.50976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853298.51073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853298.96659: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": 
{"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13273 1726853298.98726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853298.98753: stderr chunk (state=3): >>><<< 13273 1726853298.98756: stdout chunk (state=3): >>><<< 13273 1726853298.98775: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853298.98820: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853298.98828: _low_level_execute_command(): starting 13273 1726853298.98833: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853297.9921174-14115-133651064055387/ > /dev/null 2>&1 && sleep 0' 13273 1726853298.99273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853298.99302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853298.99305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match not found <<< 13273 1726853298.99309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853298.99311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853298.99313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853298.99360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853298.99364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853298.99380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853298.99440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.01387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.01411: stderr chunk (state=3): >>><<< 13273 1726853299.01414: stdout chunk (state=3): >>><<< 13273 1726853299.01428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853299.01434: handler run complete 13273 1726853299.01464: attempt loop complete, returning result 13273 1726853299.01468: _execute() done 13273 1726853299.01472: dumping result to json 13273 1726853299.01478: done dumping result, returning 13273 1726853299.01487: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-5fc3-657d-000000000036] 13273 1726853299.01492: sending task result for task 02083763-bbaf-5fc3-657d-000000000036 13273 1726853299.01606: done sending task result for task 02083763-bbaf-5fc3-657d-000000000036 13273 1726853299.01608: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } 
STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active) 13273 1726853299.01725: no more pending results, returning what we have 13273 1726853299.01728: results queue empty 13273 1726853299.01729: checking for any_errors_fatal 13273 1726853299.01736: done checking for any_errors_fatal 13273 1726853299.01737: checking for max_fail_percentage 13273 1726853299.01738: done checking for max_fail_percentage 13273 1726853299.01739: checking to see if all hosts have failed and the running result is not ok 13273 1726853299.01740: done checking to see if all hosts have failed 13273 1726853299.01740: getting the remaining hosts for this loop 13273 1726853299.01742: done getting the remaining hosts for this loop 13273 1726853299.01745: getting the next task for host managed_node3 13273 1726853299.01750: done getting next task for host managed_node3 13273 1726853299.01753: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853299.01756: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853299.01765: getting variables 13273 1726853299.01767: in VariableManager get_vars() 13273 1726853299.01818: Calling all_inventory to load vars for managed_node3 13273 1726853299.01822: Calling groups_inventory to load vars for managed_node3 13273 1726853299.01824: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.01832: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.01834: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.01841: Calling groups_plugins_play to load vars for managed_node3 13273 1726853299.02797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.03692: done with get_vars() 13273 1726853299.03707: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:28:19 -0400 (0:00:01.149) 0:00:16.926 ****** 13273 1726853299.03770: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853299.03773: Creating lock for fedora.linux_system_roles.network_state 13273 1726853299.04011: worker is 1 (out of 1 available) 13273 1726853299.04026: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853299.04038: done queuing things up, now waiting for results queue to drain 13273 1726853299.04039: waiting for pending results... 
13273 1726853299.04210: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853299.04306: in run() - task 02083763-bbaf-5fc3-657d-000000000037 13273 1726853299.04317: variable 'ansible_search_path' from source: unknown 13273 1726853299.04320: variable 'ansible_search_path' from source: unknown 13273 1726853299.04350: calling self._execute() 13273 1726853299.04421: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.04425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.04433: variable 'omit' from source: magic vars 13273 1726853299.04708: variable 'ansible_distribution_major_version' from source: facts 13273 1726853299.04718: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853299.04801: variable 'network_state' from source: role '' defaults 13273 1726853299.04811: Evaluated conditional (network_state != {}): False 13273 1726853299.04815: when evaluation is False, skipping this task 13273 1726853299.04817: _execute() done 13273 1726853299.04820: dumping result to json 13273 1726853299.04822: done dumping result, returning 13273 1726853299.04825: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-5fc3-657d-000000000037] 13273 1726853299.04833: sending task result for task 02083763-bbaf-5fc3-657d-000000000037 13273 1726853299.04912: done sending task result for task 02083763-bbaf-5fc3-657d-000000000037 13273 1726853299.04917: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853299.04976: no more pending results, returning what we have 13273 1726853299.04979: results queue empty 13273 1726853299.04980: checking for any_errors_fatal 13273 1726853299.04995: done checking for any_errors_fatal 
13273 1726853299.04996: checking for max_fail_percentage 13273 1726853299.04997: done checking for max_fail_percentage 13273 1726853299.04998: checking to see if all hosts have failed and the running result is not ok 13273 1726853299.04999: done checking to see if all hosts have failed 13273 1726853299.04999: getting the remaining hosts for this loop 13273 1726853299.05001: done getting the remaining hosts for this loop 13273 1726853299.05004: getting the next task for host managed_node3 13273 1726853299.05009: done getting next task for host managed_node3 13273 1726853299.05012: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853299.05015: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853299.05030: getting variables 13273 1726853299.05031: in VariableManager get_vars() 13273 1726853299.05084: Calling all_inventory to load vars for managed_node3 13273 1726853299.05087: Calling groups_inventory to load vars for managed_node3 13273 1726853299.05089: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.05098: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.05100: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.05102: Calling groups_plugins_play to load vars for managed_node3 13273 1726853299.06454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.07365: done with get_vars() 13273 1726853299.07383: done getting variables 13273 1726853299.07428: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:28:19 -0400 (0:00:00.036) 0:00:16.963 ****** 13273 1726853299.07451: entering _queue_task() for managed_node3/debug 13273 1726853299.07680: worker is 1 (out of 1 available) 13273 1726853299.07694: exiting _queue_task() for managed_node3/debug 13273 1726853299.07704: done queuing things up, now waiting for results queue to drain 13273 1726853299.07705: waiting for pending results... 
13273 1726853299.07889: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853299.07976: in run() - task 02083763-bbaf-5fc3-657d-000000000038 13273 1726853299.07988: variable 'ansible_search_path' from source: unknown 13273 1726853299.07992: variable 'ansible_search_path' from source: unknown 13273 1726853299.08020: calling self._execute() 13273 1726853299.08094: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.08099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.08107: variable 'omit' from source: magic vars 13273 1726853299.08380: variable 'ansible_distribution_major_version' from source: facts 13273 1726853299.08406: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853299.08410: variable 'omit' from source: magic vars 13273 1726853299.08510: variable 'omit' from source: magic vars 13273 1726853299.08513: variable 'omit' from source: magic vars 13273 1726853299.08531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853299.08575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853299.08587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853299.08604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853299.08618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853299.08648: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853299.08651: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.08654: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13273 1726853299.08750: Set connection var ansible_connection to ssh 13273 1726853299.08762: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853299.08765: Set connection var ansible_shell_executable to /bin/sh 13273 1726853299.08767: Set connection var ansible_shell_type to sh 13273 1726853299.08774: Set connection var ansible_pipelining to False 13273 1726853299.08780: Set connection var ansible_timeout to 10 13273 1726853299.08806: variable 'ansible_shell_executable' from source: unknown 13273 1726853299.08809: variable 'ansible_connection' from source: unknown 13273 1726853299.08812: variable 'ansible_module_compression' from source: unknown 13273 1726853299.08814: variable 'ansible_shell_type' from source: unknown 13273 1726853299.08816: variable 'ansible_shell_executable' from source: unknown 13273 1726853299.08818: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.08822: variable 'ansible_pipelining' from source: unknown 13273 1726853299.08826: variable 'ansible_timeout' from source: unknown 13273 1726853299.08829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.08963: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853299.09051: variable 'omit' from source: magic vars 13273 1726853299.09056: starting attempt loop 13273 1726853299.09058: running the handler 13273 1726853299.09104: variable '__network_connections_result' from source: set_fact 13273 1726853299.09169: handler run complete 13273 1726853299.09184: attempt loop complete, returning result 13273 1726853299.09187: _execute() done 13273 1726853299.09190: dumping result to json 13273 1726853299.09192: 
done dumping result, returning 13273 1726853299.09202: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-5fc3-657d-000000000038] 13273 1726853299.09205: sending task result for task 02083763-bbaf-5fc3-657d-000000000038 13273 1726853299.09343: done sending task result for task 02083763-bbaf-5fc3-657d-000000000038 13273 1726853299.09346: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)" ] } 13273 1726853299.09446: no more pending results, returning what we have 13273 1726853299.09449: results queue empty 13273 1726853299.09450: checking for any_errors_fatal 13273 1726853299.09455: done checking for any_errors_fatal 13273 1726853299.09455: checking for max_fail_percentage 13273 1726853299.09457: done checking for max_fail_percentage 13273 1726853299.09458: checking to see if all hosts have failed and the running result is not ok 13273 1726853299.09458: done checking to see if all hosts have failed 13273 1726853299.09459: getting the remaining hosts for this loop 13273 1726853299.09460: done getting the remaining hosts for this loop 13273 1726853299.09463: getting the next task for host 
managed_node3 13273 1726853299.09468: done getting next task for host managed_node3 13273 1726853299.09472: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13273 1726853299.09475: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853299.09484: getting variables 13273 1726853299.09485: in VariableManager get_vars() 13273 1726853299.09528: Calling all_inventory to load vars for managed_node3 13273 1726853299.09531: Calling groups_inventory to load vars for managed_node3 13273 1726853299.09533: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.09541: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.09543: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.09546: Calling groups_plugins_play to load vars for managed_node3 13273 1726853299.10927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.12487: done with get_vars() 13273 1726853299.12516: done getting variables 13273 1726853299.12577: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:28:19 -0400 (0:00:00.051) 0:00:17.015 ****** 13273 1726853299.12621: entering _queue_task() for managed_node3/debug 13273 1726853299.12926: worker is 1 (out of 1 available) 13273 1726853299.13052: exiting _queue_task() for managed_node3/debug 13273 1726853299.13065: done queuing things up, now waiting for results queue to drain 13273 1726853299.13066: waiting for pending results... 13273 1726853299.13284: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13273 1726853299.13487: in run() - task 02083763-bbaf-5fc3-657d-000000000039 13273 1726853299.13491: variable 'ansible_search_path' from source: unknown 13273 1726853299.13494: variable 'ansible_search_path' from source: unknown 13273 1726853299.13497: calling self._execute() 13273 1726853299.13582: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.13599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.13613: variable 'omit' from source: magic vars 13273 1726853299.14180: variable 'ansible_distribution_major_version' from source: facts 13273 1726853299.14183: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853299.14186: variable 'omit' from source: magic vars 13273 1726853299.14188: variable 'omit' from source: magic vars 13273 1726853299.14191: variable 'omit' from source: magic vars 13273 1726853299.14193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853299.14196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853299.14220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
13273 1726853299.14242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853299.14259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853299.14300: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853299.14308: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.14316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.14419: Set connection var ansible_connection to ssh 13273 1726853299.14435: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853299.14446: Set connection var ansible_shell_executable to /bin/sh 13273 1726853299.14453: Set connection var ansible_shell_type to sh 13273 1726853299.14464: Set connection var ansible_pipelining to False 13273 1726853299.14476: Set connection var ansible_timeout to 10 13273 1726853299.14512: variable 'ansible_shell_executable' from source: unknown 13273 1726853299.14520: variable 'ansible_connection' from source: unknown 13273 1726853299.14527: variable 'ansible_module_compression' from source: unknown 13273 1726853299.14534: variable 'ansible_shell_type' from source: unknown 13273 1726853299.14541: variable 'ansible_shell_executable' from source: unknown 13273 1726853299.14548: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.14557: variable 'ansible_pipelining' from source: unknown 13273 1726853299.14564: variable 'ansible_timeout' from source: unknown 13273 1726853299.14574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.14721: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853299.14741: variable 'omit' from source: magic vars 13273 1726853299.14752: starting attempt loop 13273 1726853299.14760: running the handler 13273 1726853299.14813: variable '__network_connections_result' from source: set_fact 13273 1726853299.14900: variable '__network_connections_result' from source: set_fact 13273 1726853299.15159: handler run complete 13273 1726853299.15162: attempt loop complete, returning result 13273 1726853299.15165: _execute() done 13273 1726853299.15167: dumping result to json 13273 1726853299.15169: done dumping result, returning 13273 1726853299.15174: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-5fc3-657d-000000000039] 13273 1726853299.15177: sending task result for task 02083763-bbaf-5fc3-657d-000000000039 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d\n[009] #2, state:up 
persistent_state:present, 'bond0.1': add connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a541bcb3-3a19-4ffe-82d9-f7e984395e25 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)" ] } } 13273 1726853299.15364: no more pending results, returning what we have 13273 1726853299.15368: results queue empty 13273 1726853299.15369: checking for any_errors_fatal 13273 1726853299.15378: done checking for any_errors_fatal 13273 1726853299.15379: checking for max_fail_percentage 13273 1726853299.15386: done checking for max_fail_percentage 13273 1726853299.15388: checking to see if all hosts have failed and the running result is not ok 13273 1726853299.15388: done checking to see if all hosts have failed 13273 1726853299.15389: getting the remaining hosts for this loop 13273 1726853299.15390: done getting the remaining hosts for this loop 13273 1726853299.15394: getting the next task for host managed_node3 13273 
1726853299.15399: done getting next task for host managed_node3 13273 1726853299.15403: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13273 1726853299.15407: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853299.15418: getting variables 13273 1726853299.15421: in VariableManager get_vars() 13273 1726853299.15577: Calling all_inventory to load vars for managed_node3 13273 1726853299.15584: Calling groups_inventory to load vars for managed_node3 13273 1726853299.15587: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.15598: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.15601: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.15604: Calling groups_plugins_play to load vars for managed_node3 13273 1726853299.16203: done sending task result for task 02083763-bbaf-5fc3-657d-000000000039 13273 1726853299.16206: WORKER PROCESS EXITING 13273 1726853299.17155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.18707: done with get_vars() 13273 1726853299.18736: done getting variables 13273 1726853299.18793: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:28:19 -0400 (0:00:00.062) 0:00:17.077 ****** 13273 1726853299.18823: entering _queue_task() for managed_node3/debug 13273 1726853299.19140: worker is 1 (out of 1 available) 13273 1726853299.19152: exiting _queue_task() for managed_node3/debug 13273 1726853299.19163: done queuing things up, now waiting for results queue to drain 13273 1726853299.19164: waiting for pending results... 13273 1726853299.19448: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13273 1726853299.19583: in run() - task 02083763-bbaf-5fc3-657d-00000000003a 13273 1726853299.19612: variable 'ansible_search_path' from source: unknown 13273 1726853299.19620: variable 'ansible_search_path' from source: unknown 13273 1726853299.19660: calling self._execute() 13273 1726853299.19758: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.19773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.19790: variable 'omit' from source: magic vars 13273 1726853299.20189: variable 'ansible_distribution_major_version' from source: facts 13273 1726853299.20205: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853299.20333: variable 'network_state' from source: role '' defaults 13273 1726853299.20349: Evaluated conditional (network_state != {}): False 13273 1726853299.20357: when evaluation is False, skipping this task 13273 1726853299.20375: _execute() done 13273 1726853299.20383: dumping result to json 13273 1726853299.20391: done 
dumping result, returning 13273 1726853299.20403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-5fc3-657d-00000000003a] 13273 1726853299.20475: sending task result for task 02083763-bbaf-5fc3-657d-00000000003a 13273 1726853299.20540: done sending task result for task 02083763-bbaf-5fc3-657d-00000000003a 13273 1726853299.20543: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 13273 1726853299.20594: no more pending results, returning what we have 13273 1726853299.20598: results queue empty 13273 1726853299.20599: checking for any_errors_fatal 13273 1726853299.20612: done checking for any_errors_fatal 13273 1726853299.20613: checking for max_fail_percentage 13273 1726853299.20615: done checking for max_fail_percentage 13273 1726853299.20616: checking to see if all hosts have failed and the running result is not ok 13273 1726853299.20617: done checking to see if all hosts have failed 13273 1726853299.20617: getting the remaining hosts for this loop 13273 1726853299.20620: done getting the remaining hosts for this loop 13273 1726853299.20624: getting the next task for host managed_node3 13273 1726853299.20630: done getting next task for host managed_node3 13273 1726853299.20635: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13273 1726853299.20638: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13273 1726853299.20653: getting variables 13273 1726853299.20655: in VariableManager get_vars() 13273 1726853299.20712: Calling all_inventory to load vars for managed_node3 13273 1726853299.20715: Calling groups_inventory to load vars for managed_node3 13273 1726853299.20717: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.20727: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.20729: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.20732: Calling groups_plugins_play to load vars for managed_node3 13273 1726853299.22333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.23884: done with get_vars() 13273 1726853299.23903: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:28:19 -0400 (0:00:00.051) 0:00:17.129 ****** 13273 1726853299.23998: entering _queue_task() for managed_node3/ping 13273 1726853299.24000: Creating lock for ping 13273 1726853299.24389: worker is 1 (out of 1 available) 13273 1726853299.24401: exiting _queue_task() for managed_node3/ping 13273 1726853299.24412: done queuing things up, now waiting for results queue to drain 13273 1726853299.24413: waiting for pending results... 
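The task just queued, `fedora.linux_system_roles.network : Re-test connectivity`, runs the `ping` action, whose remote side is deliberately trivial: it echoes its `data` parameter back under the key `ping` (default `pong`), and per the module documentation raises an exception if `data` is set to `crash`. A minimal sketch of that contract, leaving out the `AnsibleModule` argument-parsing boilerplate the real module carries:

```python
# Sketch of the ansible.builtin.ping contract (not the actual module source):
# echo `data` back as the "ping" value; a data of "crash" must raise.
def ping(data="pong"):
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "changed": False}
```

This is why the result a few records later is `{"ping": "pong", ...}` with `changed: false` — the task verifies only that a module can be shipped and executed end to end.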
13273 1726853299.24796: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13273 1726853299.24807: in run() - task 02083763-bbaf-5fc3-657d-00000000003b 13273 1726853299.24827: variable 'ansible_search_path' from source: unknown 13273 1726853299.24833: variable 'ansible_search_path' from source: unknown 13273 1726853299.24874: calling self._execute() 13273 1726853299.24976: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.25000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.25103: variable 'omit' from source: magic vars 13273 1726853299.25451: variable 'ansible_distribution_major_version' from source: facts 13273 1726853299.25469: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853299.25484: variable 'omit' from source: magic vars 13273 1726853299.25552: variable 'omit' from source: magic vars 13273 1726853299.25596: variable 'omit' from source: magic vars 13273 1726853299.25648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853299.25696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853299.25722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853299.25745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853299.25773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853299.25811: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853299.25821: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.25830: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13273 1726853299.25979: Set connection var ansible_connection to ssh 13273 1726853299.25983: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853299.25985: Set connection var ansible_shell_executable to /bin/sh 13273 1726853299.25986: Set connection var ansible_shell_type to sh 13273 1726853299.25989: Set connection var ansible_pipelining to False 13273 1726853299.25991: Set connection var ansible_timeout to 10 13273 1726853299.26019: variable 'ansible_shell_executable' from source: unknown 13273 1726853299.26028: variable 'ansible_connection' from source: unknown 13273 1726853299.26037: variable 'ansible_module_compression' from source: unknown 13273 1726853299.26044: variable 'ansible_shell_type' from source: unknown 13273 1726853299.26089: variable 'ansible_shell_executable' from source: unknown 13273 1726853299.26092: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.26095: variable 'ansible_pipelining' from source: unknown 13273 1726853299.26097: variable 'ansible_timeout' from source: unknown 13273 1726853299.26099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.26307: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853299.26325: variable 'omit' from source: magic vars 13273 1726853299.26376: starting attempt loop 13273 1726853299.26380: running the handler 13273 1726853299.26383: _low_level_execute_command(): starting 13273 1726853299.26386: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853299.27122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853299.27137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 
1726853299.27151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.27189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.27291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853299.27544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.27614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.29577: stdout chunk (state=3): >>>/root <<< 13273 1726853299.29581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.29583: stdout chunk (state=3): >>><<< 13273 1726853299.29585: stderr chunk (state=3): >>><<< 13273 1726853299.29589: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853299.29591: _low_level_execute_command(): starting 13273 1726853299.29594: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563 `" && echo ansible-tmp-1726853299.29511-14172-141989516072563="` echo /root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563 `" ) && sleep 0' 13273 1726853299.30086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.30103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.30118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.30173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853299.30190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.30254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.32261: stdout chunk (state=3): >>>ansible-tmp-1726853299.29511-14172-141989516072563=/root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563 <<< 13273 1726853299.32362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.32384: stderr chunk (state=3): >>><<< 13273 1726853299.32387: stdout chunk (state=3): >>><<< 13273 1726853299.32403: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853299.29511-14172-141989516072563=/root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853299.32445: variable 'ansible_module_compression' from source: unknown 13273 1726853299.32481: ANSIBALLZ: Using lock for ping 13273 1726853299.32484: ANSIBALLZ: Acquiring lock 13273 1726853299.32486: ANSIBALLZ: Lock acquired: 140136091893632 13273 1726853299.32489: ANSIBALLZ: Creating module 13273 1726853299.42729: ANSIBALLZ: Writing module into payload 13273 1726853299.42733: ANSIBALLZ: Writing module 13273 1726853299.42750: ANSIBALLZ: Renaming module 13273 1726853299.42756: ANSIBALLZ: Done creating module 13273 1726853299.42774: variable 'ansible_facts' from source: unknown 13273 1726853299.42834: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/AnsiballZ_ping.py 13273 1726853299.43058: Sending initial data 13273 1726853299.43061: Sent initial data (151 bytes) 13273 1726853299.43687: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.43761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853299.43780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853299.43805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.44093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.45692: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853299.45697: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853299.45779: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853299.45910: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpbk9p3_p1 /root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/AnsiballZ_ping.py <<< 13273 1726853299.45914: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/AnsiballZ_ping.py" <<< 13273 1726853299.46030: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpbk9p3_p1" to remote "/root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/AnsiballZ_ping.py" <<< 13273 1726853299.47256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.47276: stderr chunk (state=3): >>><<< 13273 1726853299.47478: stdout chunk (state=3): >>><<< 13273 1726853299.47484: done transferring module to remote 13273 1726853299.47486: _low_level_execute_command(): starting 13273 1726853299.47489: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/ /root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/AnsiballZ_ping.py && sleep 0' 13273 1726853299.48519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853299.48593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.48758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853299.48785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.48935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.50829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.50855: stderr chunk (state=3): >>><<< 13273 1726853299.50858: stdout chunk (state=3): >>><<< 13273 1726853299.50873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853299.50876: _low_level_execute_command(): starting 13273 1726853299.50881: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/AnsiballZ_ping.py && sleep 0' 13273 1726853299.51293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.51296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.51298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853299.51301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853299.51303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.51356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853299.51360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 13273 1726853299.51428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.66924: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13273 1726853299.68298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853299.68326: stderr chunk (state=3): >>><<< 13273 1726853299.68329: stdout chunk (state=3): >>><<< 13273 1726853299.68348: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
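The sequence above is the standard module round trip: every remote step is a `/bin/sh -c '... && sleep 0'` one-liner — create a 0700-masked temp directory and echo its name back, SFTP `AnsiballZ_ping.py` into it, `chmod u+x`, run it with the remote `/usr/bin/python3.12`, then remove the directory. A local sketch of just the temp-directory step, assuming a POSIX `/bin/sh`; the name suffix here is illustrative, not Ansible's actual random scheme, and the trailing `&& sleep 0` is omitted:

```python
import os
import subprocess
import tempfile
import time

def make_ansible_tmp(base):
    # Name in the style seen in the log: ansible-tmp-<timestamp>-<pid>-<suffix>.
    # The final component is a placeholder, not Ansible's real randomization.
    name = f"ansible-tmp-{time.time()}-{os.getpid()}-0"
    # umask 77 => the new directory is created mode 0700; the echo hands the
    # resolved path back to the caller on stdout, as in the log above.
    cmd = f'umask 77 && mkdir -p "{base}" && mkdir "{base}/{name}" && echo {name}="{base}/{name}"'
    out = subprocess.run(["/bin/sh", "-c", cmd],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip().split("=", 1)[1]
```

Parsing the path out of `name=path` on stdout is the same trick the controller uses here: it cannot know the remote `$HOME` expansion in advance, so it makes the shell report the directory it actually created.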
13273 1726853299.68374: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853299.68381: _low_level_execute_command(): starting 13273 1726853299.68383: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853299.29511-14172-141989516072563/ > /dev/null 2>&1 && sleep 0' 13273 1726853299.68844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853299.68848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.68850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853299.68852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853299.68854: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.68906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853299.68910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.68974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.70881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.70909: stderr chunk (state=3): >>><<< 13273 1726853299.70912: stdout chunk (state=3): >>><<< 13273 1726853299.70926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 13273 1726853299.70935: handler run complete 13273 1726853299.70952: attempt loop complete, returning result 13273 1726853299.70955: _execute() done 13273 1726853299.70957: dumping result to json 13273 1726853299.70959: done dumping result, returning 13273 1726853299.70968: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-5fc3-657d-00000000003b] 13273 1726853299.70973: sending task result for task 02083763-bbaf-5fc3-657d-00000000003b 13273 1726853299.71061: done sending task result for task 02083763-bbaf-5fc3-657d-00000000003b 13273 1726853299.71064: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 13273 1726853299.71154: no more pending results, returning what we have 13273 1726853299.71157: results queue empty 13273 1726853299.71158: checking for any_errors_fatal 13273 1726853299.71164: done checking for any_errors_fatal 13273 1726853299.71164: checking for max_fail_percentage 13273 1726853299.71166: done checking for max_fail_percentage 13273 1726853299.71167: checking to see if all hosts have failed and the running result is not ok 13273 1726853299.71167: done checking to see if all hosts have failed 13273 1726853299.71168: getting the remaining hosts for this loop 13273 1726853299.71169: done getting the remaining hosts for this loop 13273 1726853299.71175: getting the next task for host managed_node3 13273 1726853299.71183: done getting next task for host managed_node3 13273 1726853299.71185: ^ task is: TASK: meta (role_complete) 13273 1726853299.71188: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853299.71197: getting variables 13273 1726853299.71199: in VariableManager get_vars() 13273 1726853299.71248: Calling all_inventory to load vars for managed_node3 13273 1726853299.71251: Calling groups_inventory to load vars for managed_node3 13273 1726853299.71253: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.71262: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.71265: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.71267: Calling groups_plugins_play to load vars for managed_node3 13273 1726853299.72095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.73028: done with get_vars() 13273 1726853299.73046: done getting variables 13273 1726853299.73104: done queuing things up, now waiting for results queue to drain 13273 1726853299.73106: results queue empty 13273 1726853299.73106: checking for any_errors_fatal 13273 1726853299.73108: done checking for any_errors_fatal 13273 1726853299.73108: checking for max_fail_percentage 13273 1726853299.73109: done checking for max_fail_percentage 13273 1726853299.73110: checking to see if all hosts have failed and the running result is not ok 13273 1726853299.73110: done checking to see if all hosts have failed 13273 1726853299.73110: getting the remaining hosts for this loop 13273 1726853299.73111: done getting the remaining hosts for this loop 13273 1726853299.73113: getting the next task for host managed_node3 13273 1726853299.73118: done getting next task for host managed_node3 13273 1726853299.73120: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13273 1726853299.73121: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853299.73123: getting variables 13273 1726853299.73124: in VariableManager get_vars() 13273 1726853299.73137: Calling all_inventory to load vars for managed_node3 13273 1726853299.73138: Calling groups_inventory to load vars for managed_node3 13273 1726853299.73139: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.73143: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.73145: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.73147: Calling groups_plugins_play to load vars for managed_node3 13273 1726853299.73772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.74615: done with get_vars() 13273 1726853299.74632: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:28:19 -0400 (0:00:00.506) 0:00:17.636 ****** 13273 1726853299.74686: entering _queue_task() for managed_node3/include_tasks 13273 1726853299.74936: worker is 1 (out of 1 available) 13273 1726853299.74948: exiting _queue_task() for managed_node3/include_tasks 13273 1726853299.74958: done queuing things up, now waiting for results queue to drain 13273 1726853299.74959: waiting for pending results... 
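The task banner above carries two timing figures from the callback output: the parenthesized `(0:00:00.506)` is the elapsed time since the previous banner, and `0:00:17.636` is the cumulative run time (the prior banner read `0:00:17.129`, and 17.129 + 0.506 rounds to 17.636). When post-processing a log like this one, a small sketch for pulling both figures out of a banner line:

```python
import re

# Match the "(H:MM:SS.mmm) H:MM:SS.mmm" pair on a task banner line.
BANNER = re.compile(r"\((\d+):(\d+):(\d+\.\d+)\)\s+(\d+):(\d+):(\d+\.\d+)")

def parse_timing(line):
    """Return (task_seconds, cumulative_seconds), or None if no pair found."""
    m = BANNER.search(line)
    if not m:
        return None
    h1, m1, s1, h2, m2, s2 = m.groups()
    dur = int(h1) * 3600 + int(m1) * 60 + float(s1)
    total = int(h2) * 3600 + int(m2) * 60 + float(s2)
    return dur, total
```

The pattern anchors on the parentheses, so the wall-clock timestamp earlier in the banner (`13:28:19`) is never mistaken for a duration.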
13273 1726853299.75143: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 13273 1726853299.75231: in run() - task 02083763-bbaf-5fc3-657d-00000000006e 13273 1726853299.75241: variable 'ansible_search_path' from source: unknown 13273 1726853299.75248: variable 'ansible_search_path' from source: unknown 13273 1726853299.75277: calling self._execute() 13273 1726853299.75354: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.75360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.75368: variable 'omit' from source: magic vars 13273 1726853299.75662: variable 'ansible_distribution_major_version' from source: facts 13273 1726853299.75673: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853299.75678: _execute() done 13273 1726853299.75682: dumping result to json 13273 1726853299.75684: done dumping result, returning 13273 1726853299.75691: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-5fc3-657d-00000000006e] 13273 1726853299.75696: sending task result for task 02083763-bbaf-5fc3-657d-00000000006e 13273 1726853299.75783: done sending task result for task 02083763-bbaf-5fc3-657d-00000000006e 13273 1726853299.75786: WORKER PROCESS EXITING 13273 1726853299.75813: no more pending results, returning what we have 13273 1726853299.75817: in VariableManager get_vars() 13273 1726853299.75878: Calling all_inventory to load vars for managed_node3 13273 1726853299.75881: Calling groups_inventory to load vars for managed_node3 13273 1726853299.75884: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.75898: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.75900: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.75903: Calling groups_plugins_play to load vars for managed_node3 13273 
1726853299.76774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.77608: done with get_vars() 13273 1726853299.77621: variable 'ansible_search_path' from source: unknown 13273 1726853299.77621: variable 'ansible_search_path' from source: unknown 13273 1726853299.77648: we have included files to process 13273 1726853299.77648: generating all_blocks data 13273 1726853299.77650: done generating all_blocks data 13273 1726853299.77653: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853299.77654: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853299.77655: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13273 1726853299.77781: done processing included file 13273 1726853299.77782: iterating over new_blocks loaded from include file 13273 1726853299.77783: in VariableManager get_vars() 13273 1726853299.77800: done with get_vars() 13273 1726853299.77801: filtering new block on tags 13273 1726853299.77811: done filtering new block on tags 13273 1726853299.77813: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 13273 1726853299.77817: extending task lists for all hosts with included blocks 13273 1726853299.77878: done extending task lists 13273 1726853299.77879: done processing included files 13273 1726853299.77880: results queue empty 13273 1726853299.77880: checking for any_errors_fatal 13273 1726853299.77881: done checking for any_errors_fatal 13273 1726853299.77881: checking for max_fail_percentage 13273 1726853299.77882: done checking for 
max_fail_percentage 13273 1726853299.77882: checking to see if all hosts have failed and the running result is not ok 13273 1726853299.77883: done checking to see if all hosts have failed 13273 1726853299.77883: getting the remaining hosts for this loop 13273 1726853299.77884: done getting the remaining hosts for this loop 13273 1726853299.77886: getting the next task for host managed_node3 13273 1726853299.77888: done getting next task for host managed_node3 13273 1726853299.77890: ^ task is: TASK: Get stat for interface {{ interface }} 13273 1726853299.77892: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853299.77893: getting variables 13273 1726853299.77894: in VariableManager get_vars() 13273 1726853299.77906: Calling all_inventory to load vars for managed_node3 13273 1726853299.77907: Calling groups_inventory to load vars for managed_node3 13273 1726853299.77908: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853299.77912: Calling all_plugins_play to load vars for managed_node3 13273 1726853299.77913: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853299.77915: Calling groups_plugins_play to load vars for managed_node3 13273 1726853299.78533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853299.79364: done with get_vars() 13273 1726853299.79379: done getting variables 13273 1726853299.79489: variable 'interface' from source: task vars 13273 1726853299.79492: variable 'controller_device' from source: play vars 13273 1726853299.79531: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:28:19 -0400 (0:00:00.048) 0:00:17.684 ****** 13273 1726853299.79557: entering _queue_task() for managed_node3/stat 13273 1726853299.79794: worker is 1 (out of 1 available) 13273 1726853299.79806: exiting _queue_task() for managed_node3/stat 13273 1726853299.79818: done queuing things up, now waiting for results queue to drain 13273 1726853299.79820: waiting for pending results... 
13273 1726853299.79993: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 13273 1726853299.80063: in run() - task 02083763-bbaf-5fc3-657d-000000000337 13273 1726853299.80078: variable 'ansible_search_path' from source: unknown 13273 1726853299.80082: variable 'ansible_search_path' from source: unknown 13273 1726853299.80113: calling self._execute() 13273 1726853299.80184: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.80188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.80197: variable 'omit' from source: magic vars 13273 1726853299.80457: variable 'ansible_distribution_major_version' from source: facts 13273 1726853299.80466: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853299.80473: variable 'omit' from source: magic vars 13273 1726853299.80514: variable 'omit' from source: magic vars 13273 1726853299.80577: variable 'interface' from source: task vars 13273 1726853299.80581: variable 'controller_device' from source: play vars 13273 1726853299.80629: variable 'controller_device' from source: play vars 13273 1726853299.80646: variable 'omit' from source: magic vars 13273 1726853299.80677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853299.80708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853299.80723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853299.80736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853299.80747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853299.80768: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13273 1726853299.80773: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.80776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.80844: Set connection var ansible_connection to ssh 13273 1726853299.80851: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853299.80856: Set connection var ansible_shell_executable to /bin/sh 13273 1726853299.80859: Set connection var ansible_shell_type to sh 13273 1726853299.80864: Set connection var ansible_pipelining to False 13273 1726853299.80869: Set connection var ansible_timeout to 10 13273 1726853299.80889: variable 'ansible_shell_executable' from source: unknown 13273 1726853299.80892: variable 'ansible_connection' from source: unknown 13273 1726853299.80895: variable 'ansible_module_compression' from source: unknown 13273 1726853299.80897: variable 'ansible_shell_type' from source: unknown 13273 1726853299.80899: variable 'ansible_shell_executable' from source: unknown 13273 1726853299.80902: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853299.80904: variable 'ansible_pipelining' from source: unknown 13273 1726853299.80907: variable 'ansible_timeout' from source: unknown 13273 1726853299.80909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853299.81050: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853299.81058: variable 'omit' from source: magic vars 13273 1726853299.81063: starting attempt loop 13273 1726853299.81066: running the handler 13273 1726853299.81080: _low_level_execute_command(): starting 13273 1726853299.81086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 
1726853299.81591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.81595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.81598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853299.81601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.81650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853299.81654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853299.81656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.81731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.83447: stdout chunk (state=3): >>>/root <<< 13273 1726853299.83548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.83574: stderr chunk (state=3): >>><<< 13273 1726853299.83578: stdout chunk (state=3): >>><<< 13273 1726853299.83596: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853299.83608: _low_level_execute_command(): starting 13273 1726853299.83613: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714 `" && echo ansible-tmp-1726853299.8359704-14196-259385355523714="` echo /root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714 `" ) && sleep 0' 13273 1726853299.84047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853299.84051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853299.84060: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.84063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.84065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.84110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853299.84114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.84175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.86137: stdout chunk (state=3): >>>ansible-tmp-1726853299.8359704-14196-259385355523714=/root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714 <<< 13273 1726853299.86249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.86276: stderr chunk (state=3): >>><<< 13273 1726853299.86279: stdout chunk (state=3): >>><<< 13273 1726853299.86292: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853299.8359704-14196-259385355523714=/root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853299.86328: variable 'ansible_module_compression' from source: unknown 13273 1726853299.86376: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13273 1726853299.86403: variable 'ansible_facts' from source: unknown 13273 1726853299.86468: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/AnsiballZ_stat.py 13273 1726853299.86562: Sending initial data 13273 1726853299.86565: Sent initial data (153 bytes) 13273 1726853299.87010: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.87015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853299.87017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass <<< 13273 1726853299.87021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853299.87023: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.87069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853299.87074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.87142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.88782: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13273 1726853299.88786: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853299.88840: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853299.88899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp78skkkpv /root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/AnsiballZ_stat.py <<< 13273 1726853299.88902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/AnsiballZ_stat.py" <<< 13273 1726853299.88954: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp78skkkpv" to remote "/root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/AnsiballZ_stat.py" <<< 13273 1726853299.89548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.89586: stderr chunk (state=3): >>><<< 13273 1726853299.89590: stdout chunk (state=3): >>><<< 13273 1726853299.89634: done transferring module to remote 13273 1726853299.89641: _low_level_execute_command(): starting 13273 1726853299.89648: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/ /root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/AnsiballZ_stat.py && sleep 0' 13273 1726853299.90075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.90078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.90080: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.90086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.90127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853299.90130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.90197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853299.92049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853299.92073: stderr chunk (state=3): >>><<< 13273 1726853299.92076: stdout chunk (state=3): >>><<< 13273 1726853299.92090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853299.92093: _low_level_execute_command(): starting 13273 1726853299.92097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/AnsiballZ_stat.py && sleep 0' 13273 1726853299.92515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.92518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853299.92520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853299.92522: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853299.92524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853299.92577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853299.92580: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853299.92651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.08484: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29522, "dev": 23, "nlink": 1, "atime": 1726853298.7783458, "mtime": 1726853298.7783458, "ctime": 1726853298.7783458, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13273 1726853300.09829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853300.09860: stderr chunk (state=3): >>><<< 13273 1726853300.09864: stdout chunk (state=3): >>><<< 13273 1726853300.09883: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29522, "dev": 23, "nlink": 1, "atime": 1726853298.7783458, "mtime": 1726853298.7783458, "ctime": 1726853298.7783458, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853300.09922: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853300.09931: _low_level_execute_command(): starting 13273 1726853300.09934: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853299.8359704-14196-259385355523714/ > /dev/null 2>&1 && sleep 0' 13273 1726853300.10365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.10369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853300.10401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853300.10404: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853300.10406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.10408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.10465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853300.10468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853300.10470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.10538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.12606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.12632: stderr chunk (state=3): >>><<< 13273 1726853300.12635: stdout chunk (state=3): >>><<< 13273 1726853300.12650: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853300.12657: handler run complete 13273 1726853300.12693: attempt loop complete, returning result 13273 1726853300.12696: _execute() done 13273 1726853300.12698: dumping result to json 13273 1726853300.12703: done dumping result, returning 13273 1726853300.12711: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [02083763-bbaf-5fc3-657d-000000000337] 13273 1726853300.12715: sending task result for task 02083763-bbaf-5fc3-657d-000000000337 13273 1726853300.12819: done sending task result for task 02083763-bbaf-5fc3-657d-000000000337 13273 1726853300.12821: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853298.7783458, "block_size": 4096, "blocks": 0, "ctime": 1726853298.7783458, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29522, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726853298.7783458, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, 
"xusr": true } } 13273 1726853300.12906: no more pending results, returning what we have 13273 1726853300.12909: results queue empty 13273 1726853300.12910: checking for any_errors_fatal 13273 1726853300.12912: done checking for any_errors_fatal 13273 1726853300.12912: checking for max_fail_percentage 13273 1726853300.12914: done checking for max_fail_percentage 13273 1726853300.12915: checking to see if all hosts have failed and the running result is not ok 13273 1726853300.12915: done checking to see if all hosts have failed 13273 1726853300.12916: getting the remaining hosts for this loop 13273 1726853300.12917: done getting the remaining hosts for this loop 13273 1726853300.12920: getting the next task for host managed_node3 13273 1726853300.12931: done getting next task for host managed_node3 13273 1726853300.12933: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13273 1726853300.12936: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853300.12940: getting variables 13273 1726853300.12941: in VariableManager get_vars() 13273 1726853300.12989: Calling all_inventory to load vars for managed_node3 13273 1726853300.12992: Calling groups_inventory to load vars for managed_node3 13273 1726853300.12994: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.13003: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.13006: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.13008: Calling groups_plugins_play to load vars for managed_node3 13273 1726853300.13850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.14693: done with get_vars() 13273 1726853300.14711: done getting variables 13273 1726853300.14753: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853300.14838: variable 'interface' from source: task vars 13273 1726853300.14841: variable 'controller_device' from source: play vars 13273 1726853300.14884: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:28:20 -0400 (0:00:00.353) 0:00:18.038 ****** 13273 1726853300.14910: entering _queue_task() for managed_node3/assert 13273 1726853300.15141: worker is 1 (out of 1 available) 13273 1726853300.15154: exiting _queue_task() for managed_node3/assert 13273 1726853300.15167: done queuing things up, now waiting for results queue to drain 13273 1726853300.15169: waiting for pending results... 
13273 1726853300.15348: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 13273 1726853300.15433: in run() - task 02083763-bbaf-5fc3-657d-00000000006f 13273 1726853300.15444: variable 'ansible_search_path' from source: unknown 13273 1726853300.15448: variable 'ansible_search_path' from source: unknown 13273 1726853300.15480: calling self._execute() 13273 1726853300.15551: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.15554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.15563: variable 'omit' from source: magic vars 13273 1726853300.15827: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.15838: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.15843: variable 'omit' from source: magic vars 13273 1726853300.15876: variable 'omit' from source: magic vars 13273 1726853300.15941: variable 'interface' from source: task vars 13273 1726853300.15945: variable 'controller_device' from source: play vars 13273 1726853300.15994: variable 'controller_device' from source: play vars 13273 1726853300.16007: variable 'omit' from source: magic vars 13273 1726853300.16039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853300.16069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853300.16086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853300.16100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853300.16110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853300.16133: variable 'inventory_hostname' from source: 
host vars for 'managed_node3' 13273 1726853300.16136: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.16138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.16209: Set connection var ansible_connection to ssh 13273 1726853300.16218: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853300.16223: Set connection var ansible_shell_executable to /bin/sh 13273 1726853300.16226: Set connection var ansible_shell_type to sh 13273 1726853300.16231: Set connection var ansible_pipelining to False 13273 1726853300.16236: Set connection var ansible_timeout to 10 13273 1726853300.16259: variable 'ansible_shell_executable' from source: unknown 13273 1726853300.16262: variable 'ansible_connection' from source: unknown 13273 1726853300.16264: variable 'ansible_module_compression' from source: unknown 13273 1726853300.16267: variable 'ansible_shell_type' from source: unknown 13273 1726853300.16269: variable 'ansible_shell_executable' from source: unknown 13273 1726853300.16273: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.16276: variable 'ansible_pipelining' from source: unknown 13273 1726853300.16278: variable 'ansible_timeout' from source: unknown 13273 1726853300.16288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.16382: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853300.16392: variable 'omit' from source: magic vars 13273 1726853300.16395: starting attempt loop 13273 1726853300.16398: running the handler 13273 1726853300.16485: variable 'interface_stat' from source: set_fact 13273 1726853300.16501: Evaluated conditional 
(interface_stat.stat.exists): True 13273 1726853300.16504: handler run complete 13273 1726853300.16517: attempt loop complete, returning result 13273 1726853300.16520: _execute() done 13273 1726853300.16523: dumping result to json 13273 1726853300.16526: done dumping result, returning 13273 1726853300.16531: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [02083763-bbaf-5fc3-657d-00000000006f] 13273 1726853300.16536: sending task result for task 02083763-bbaf-5fc3-657d-00000000006f 13273 1726853300.16614: done sending task result for task 02083763-bbaf-5fc3-657d-00000000006f 13273 1726853300.16619: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853300.16663: no more pending results, returning what we have 13273 1726853300.16667: results queue empty 13273 1726853300.16667: checking for any_errors_fatal 13273 1726853300.16676: done checking for any_errors_fatal 13273 1726853300.16677: checking for max_fail_percentage 13273 1726853300.16679: done checking for max_fail_percentage 13273 1726853300.16680: checking to see if all hosts have failed and the running result is not ok 13273 1726853300.16680: done checking to see if all hosts have failed 13273 1726853300.16681: getting the remaining hosts for this loop 13273 1726853300.16682: done getting the remaining hosts for this loop 13273 1726853300.16686: getting the next task for host managed_node3 13273 1726853300.16694: done getting next task for host managed_node3 13273 1726853300.16697: ^ task is: TASK: Include the task 'assert_profile_present.yml' 13273 1726853300.16699: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853300.16703: getting variables 13273 1726853300.16704: in VariableManager get_vars() 13273 1726853300.16748: Calling all_inventory to load vars for managed_node3 13273 1726853300.16751: Calling groups_inventory to load vars for managed_node3 13273 1726853300.16753: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.16761: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.16763: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.16766: Calling groups_plugins_play to load vars for managed_node3 13273 1726853300.17512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.18433: done with get_vars() 13273 1726853300.18447: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67 Friday 20 September 2024 13:28:20 -0400 (0:00:00.035) 0:00:18.074 ****** 13273 1726853300.18508: entering _queue_task() for managed_node3/include_tasks 13273 1726853300.18708: worker is 1 (out of 1 available) 13273 1726853300.18720: exiting _queue_task() for managed_node3/include_tasks 13273 1726853300.18732: done queuing things up, now waiting for results queue to drain 13273 1726853300.18733: waiting for pending results... 
13273 1726853300.18904: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 13273 1726853300.18962: in run() - task 02083763-bbaf-5fc3-657d-000000000070 13273 1726853300.18969: variable 'ansible_search_path' from source: unknown 13273 1726853300.19007: variable 'controller_profile' from source: play vars 13273 1726853300.19138: variable 'controller_profile' from source: play vars 13273 1726853300.19152: variable 'port1_profile' from source: play vars 13273 1726853300.19202: variable 'port1_profile' from source: play vars 13273 1726853300.19207: variable 'port2_profile' from source: play vars 13273 1726853300.19253: variable 'port2_profile' from source: play vars 13273 1726853300.19262: variable 'omit' from source: magic vars 13273 1726853300.19364: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.19372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.19382: variable 'omit' from source: magic vars 13273 1726853300.19549: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.19556: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.19580: variable 'item' from source: unknown 13273 1726853300.19625: variable 'item' from source: unknown 13273 1726853300.19734: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.19738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.19741: variable 'omit' from source: magic vars 13273 1726853300.19827: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.19830: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.19853: variable 'item' from source: unknown 13273 1726853300.19897: variable 'item' from source: unknown 13273 1726853300.19958: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 
1726853300.19974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.19980: variable 'omit' from source: magic vars 13273 1726853300.20070: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.20076: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.20102: variable 'item' from source: unknown 13273 1726853300.20144: variable 'item' from source: unknown 13273 1726853300.20201: dumping result to json 13273 1726853300.20204: done dumping result, returning 13273 1726853300.20207: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [02083763-bbaf-5fc3-657d-000000000070] 13273 1726853300.20209: sending task result for task 02083763-bbaf-5fc3-657d-000000000070 13273 1726853300.20246: done sending task result for task 02083763-bbaf-5fc3-657d-000000000070 13273 1726853300.20249: WORKER PROCESS EXITING 13273 1726853300.20276: no more pending results, returning what we have 13273 1726853300.20281: in VariableManager get_vars() 13273 1726853300.20334: Calling all_inventory to load vars for managed_node3 13273 1726853300.20337: Calling groups_inventory to load vars for managed_node3 13273 1726853300.20340: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.20352: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.20354: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.20356: Calling groups_plugins_play to load vars for managed_node3 13273 1726853300.24036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.24868: done with get_vars() 13273 1726853300.24882: variable 'ansible_search_path' from source: unknown 13273 1726853300.24892: variable 'ansible_search_path' from source: unknown 13273 1726853300.24897: variable 'ansible_search_path' from source: unknown 13273 
1726853300.24901: we have included files to process 13273 1726853300.24902: generating all_blocks data 13273 1726853300.24903: done generating all_blocks data 13273 1726853300.24904: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.24905: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.24906: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.25014: in VariableManager get_vars() 13273 1726853300.25034: done with get_vars() 13273 1726853300.25197: done processing included file 13273 1726853300.25199: iterating over new_blocks loaded from include file 13273 1726853300.25200: in VariableManager get_vars() 13273 1726853300.25214: done with get_vars() 13273 1726853300.25215: filtering new block on tags 13273 1726853300.25226: done filtering new block on tags 13273 1726853300.25228: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 13273 1726853300.25231: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.25231: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.25233: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.25295: in VariableManager get_vars() 13273 1726853300.25312: done with get_vars() 13273 1726853300.25456: done 
processing included file 13273 1726853300.25458: iterating over new_blocks loaded from include file 13273 1726853300.25458: in VariableManager get_vars() 13273 1726853300.25474: done with get_vars() 13273 1726853300.25476: filtering new block on tags 13273 1726853300.25486: done filtering new block on tags 13273 1726853300.25487: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 13273 1726853300.25490: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.25491: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.25493: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13273 1726853300.25586: in VariableManager get_vars() 13273 1726853300.25605: done with get_vars() 13273 1726853300.25750: done processing included file 13273 1726853300.25751: iterating over new_blocks loaded from include file 13273 1726853300.25752: in VariableManager get_vars() 13273 1726853300.25766: done with get_vars() 13273 1726853300.25767: filtering new block on tags 13273 1726853300.25780: done filtering new block on tags 13273 1726853300.25781: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 13273 1726853300.25783: extending task lists for all hosts with included blocks 13273 1726853300.28745: done extending task lists 13273 1726853300.28751: done processing included files 13273 1726853300.28752: results queue empty 13273 
1726853300.28752: checking for any_errors_fatal 13273 1726853300.28755: done checking for any_errors_fatal 13273 1726853300.28756: checking for max_fail_percentage 13273 1726853300.28757: done checking for max_fail_percentage 13273 1726853300.28757: checking to see if all hosts have failed and the running result is not ok 13273 1726853300.28758: done checking to see if all hosts have failed 13273 1726853300.28758: getting the remaining hosts for this loop 13273 1726853300.28759: done getting the remaining hosts for this loop 13273 1726853300.28760: getting the next task for host managed_node3 13273 1726853300.28763: done getting next task for host managed_node3 13273 1726853300.28764: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13273 1726853300.28765: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853300.28767: getting variables 13273 1726853300.28768: in VariableManager get_vars() 13273 1726853300.28781: Calling all_inventory to load vars for managed_node3 13273 1726853300.28783: Calling groups_inventory to load vars for managed_node3 13273 1726853300.28784: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.28788: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.28789: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.28791: Calling groups_plugins_play to load vars for managed_node3 13273 1726853300.29454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.30302: done with get_vars() 13273 1726853300.30315: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:28:20 -0400 (0:00:00.118) 0:00:18.192 ****** 13273 1726853300.30362: entering _queue_task() for managed_node3/include_tasks 13273 1726853300.30636: worker is 1 (out of 1 available) 13273 1726853300.30651: exiting _queue_task() for managed_node3/include_tasks 13273 1726853300.30663: done queuing things up, now waiting for results queue to drain 13273 1726853300.30664: waiting for pending results... 
13273 1726853300.30838: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 13273 1726853300.30903: in run() - task 02083763-bbaf-5fc3-657d-000000000355 13273 1726853300.30914: variable 'ansible_search_path' from source: unknown 13273 1726853300.30917: variable 'ansible_search_path' from source: unknown 13273 1726853300.30949: calling self._execute() 13273 1726853300.31023: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.31029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.31038: variable 'omit' from source: magic vars 13273 1726853300.31324: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.31333: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.31337: _execute() done 13273 1726853300.31340: dumping result to json 13273 1726853300.31343: done dumping result, returning 13273 1726853300.31351: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-5fc3-657d-000000000355] 13273 1726853300.31356: sending task result for task 02083763-bbaf-5fc3-657d-000000000355 13273 1726853300.31436: done sending task result for task 02083763-bbaf-5fc3-657d-000000000355 13273 1726853300.31438: WORKER PROCESS EXITING 13273 1726853300.31465: no more pending results, returning what we have 13273 1726853300.31469: in VariableManager get_vars() 13273 1726853300.31524: Calling all_inventory to load vars for managed_node3 13273 1726853300.31527: Calling groups_inventory to load vars for managed_node3 13273 1726853300.31530: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.31541: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.31544: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.31547: Calling groups_plugins_play to load vars for managed_node3 13273 
1726853300.32307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.33261: done with get_vars() 13273 1726853300.33275: variable 'ansible_search_path' from source: unknown 13273 1726853300.33276: variable 'ansible_search_path' from source: unknown 13273 1726853300.33301: we have included files to process 13273 1726853300.33302: generating all_blocks data 13273 1726853300.33303: done generating all_blocks data 13273 1726853300.33304: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853300.33305: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853300.33307: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853300.33947: done processing included file 13273 1726853300.33949: iterating over new_blocks loaded from include file 13273 1726853300.33950: in VariableManager get_vars() 13273 1726853300.33969: done with get_vars() 13273 1726853300.33970: filtering new block on tags 13273 1726853300.33985: done filtering new block on tags 13273 1726853300.33987: in VariableManager get_vars() 13273 1726853300.34002: done with get_vars() 13273 1726853300.34003: filtering new block on tags 13273 1726853300.34014: done filtering new block on tags 13273 1726853300.34016: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 13273 1726853300.34019: extending task lists for all hosts with included blocks 13273 1726853300.34116: done extending task lists 13273 1726853300.34118: done processing included files 13273 1726853300.34118: results queue empty 13273 
1726853300.34119: checking for any_errors_fatal 13273 1726853300.34121: done checking for any_errors_fatal 13273 1726853300.34121: checking for max_fail_percentage 13273 1726853300.34122: done checking for max_fail_percentage 13273 1726853300.34122: checking to see if all hosts have failed and the running result is not ok 13273 1726853300.34123: done checking to see if all hosts have failed 13273 1726853300.34123: getting the remaining hosts for this loop 13273 1726853300.34124: done getting the remaining hosts for this loop 13273 1726853300.34126: getting the next task for host managed_node3 13273 1726853300.34128: done getting next task for host managed_node3 13273 1726853300.34130: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13273 1726853300.34132: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853300.34133: getting variables 13273 1726853300.34134: in VariableManager get_vars() 13273 1726853300.34177: Calling all_inventory to load vars for managed_node3 13273 1726853300.34179: Calling groups_inventory to load vars for managed_node3 13273 1726853300.34180: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.34185: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.34187: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.34189: Calling groups_plugins_play to load vars for managed_node3 13273 1726853300.34777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.35604: done with get_vars() 13273 1726853300.35617: done getting variables 13273 1726853300.35644: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:28:20 -0400 (0:00:00.052) 0:00:18.245 ****** 13273 1726853300.35663: entering _queue_task() for managed_node3/set_fact 13273 1726853300.35910: worker is 1 (out of 1 available) 13273 1726853300.35921: exiting _queue_task() for managed_node3/set_fact 13273 1726853300.35934: done queuing things up, now waiting for results queue to drain 13273 1726853300.35935: waiting for pending results... 
13273 1726853300.36111: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13273 1726853300.36184: in run() - task 02083763-bbaf-5fc3-657d-0000000005e4 13273 1726853300.36197: variable 'ansible_search_path' from source: unknown 13273 1726853300.36201: variable 'ansible_search_path' from source: unknown 13273 1726853300.36228: calling self._execute() 13273 1726853300.36297: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.36302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.36311: variable 'omit' from source: magic vars 13273 1726853300.36678: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.36688: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.36694: variable 'omit' from source: magic vars 13273 1726853300.36729: variable 'omit' from source: magic vars 13273 1726853300.36757: variable 'omit' from source: magic vars 13273 1726853300.36788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853300.36814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853300.36831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853300.36848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853300.36858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853300.36882: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853300.36886: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.36888: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13273 1726853300.36956: Set connection var ansible_connection to ssh 13273 1726853300.36964: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853300.36969: Set connection var ansible_shell_executable to /bin/sh 13273 1726853300.36974: Set connection var ansible_shell_type to sh 13273 1726853300.36979: Set connection var ansible_pipelining to False 13273 1726853300.36984: Set connection var ansible_timeout to 10 13273 1726853300.37004: variable 'ansible_shell_executable' from source: unknown 13273 1726853300.37007: variable 'ansible_connection' from source: unknown 13273 1726853300.37010: variable 'ansible_module_compression' from source: unknown 13273 1726853300.37013: variable 'ansible_shell_type' from source: unknown 13273 1726853300.37015: variable 'ansible_shell_executable' from source: unknown 13273 1726853300.37018: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.37020: variable 'ansible_pipelining' from source: unknown 13273 1726853300.37023: variable 'ansible_timeout' from source: unknown 13273 1726853300.37028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.37129: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853300.37137: variable 'omit' from source: magic vars 13273 1726853300.37144: starting attempt loop 13273 1726853300.37147: running the handler 13273 1726853300.37160: handler run complete 13273 1726853300.37168: attempt loop complete, returning result 13273 1726853300.37172: _execute() done 13273 1726853300.37175: dumping result to json 13273 1726853300.37177: done dumping result, returning 13273 1726853300.37184: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-5fc3-657d-0000000005e4] 13273 1726853300.37188: sending task result for task 02083763-bbaf-5fc3-657d-0000000005e4 13273 1726853300.37263: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005e4 13273 1726853300.37266: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13273 1726853300.37318: no more pending results, returning what we have 13273 1726853300.37321: results queue empty 13273 1726853300.37322: checking for any_errors_fatal 13273 1726853300.37324: done checking for any_errors_fatal 13273 1726853300.37325: checking for max_fail_percentage 13273 1726853300.37326: done checking for max_fail_percentage 13273 1726853300.37327: checking to see if all hosts have failed and the running result is not ok 13273 1726853300.37327: done checking to see if all hosts have failed 13273 1726853300.37328: getting the remaining hosts for this loop 13273 1726853300.37329: done getting the remaining hosts for this loop 13273 1726853300.37332: getting the next task for host managed_node3 13273 1726853300.37338: done getting next task for host managed_node3 13273 1726853300.37340: ^ task is: TASK: Stat profile file 13273 1726853300.37344: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853300.37348: getting variables 13273 1726853300.37350: in VariableManager get_vars() 13273 1726853300.37399: Calling all_inventory to load vars for managed_node3 13273 1726853300.37402: Calling groups_inventory to load vars for managed_node3 13273 1726853300.37405: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.37413: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.37415: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.37417: Calling groups_plugins_play to load vars for managed_node3 13273 1726853300.38337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.39934: done with get_vars() 13273 1726853300.39963: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:28:20 -0400 (0:00:00.043) 0:00:18.289 ****** 13273 1726853300.40027: entering _queue_task() for managed_node3/stat 13273 1726853300.40259: worker is 1 (out of 1 available) 13273 1726853300.40269: exiting _queue_task() for managed_node3/stat 13273 1726853300.40283: done queuing things up, now waiting for results queue to drain 13273 1726853300.40285: waiting for pending results... 
13273 1726853300.40456: running TaskExecutor() for managed_node3/TASK: Stat profile file 13273 1726853300.40518: in run() - task 02083763-bbaf-5fc3-657d-0000000005e5 13273 1726853300.40529: variable 'ansible_search_path' from source: unknown 13273 1726853300.40533: variable 'ansible_search_path' from source: unknown 13273 1726853300.40562: calling self._execute() 13273 1726853300.40634: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.40640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.40651: variable 'omit' from source: magic vars 13273 1726853300.40920: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.40929: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.40935: variable 'omit' from source: magic vars 13273 1726853300.40973: variable 'omit' from source: magic vars 13273 1726853300.41037: variable 'profile' from source: include params 13273 1726853300.41041: variable 'item' from source: include params 13273 1726853300.41094: variable 'item' from source: include params 13273 1726853300.41108: variable 'omit' from source: magic vars 13273 1726853300.41142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853300.41173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853300.41188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853300.41201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853300.41210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853300.41233: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 
1726853300.41236: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.41238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.41309: Set connection var ansible_connection to ssh 13273 1726853300.41317: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853300.41322: Set connection var ansible_shell_executable to /bin/sh 13273 1726853300.41325: Set connection var ansible_shell_type to sh 13273 1726853300.41330: Set connection var ansible_pipelining to False 13273 1726853300.41335: Set connection var ansible_timeout to 10 13273 1726853300.41359: variable 'ansible_shell_executable' from source: unknown 13273 1726853300.41363: variable 'ansible_connection' from source: unknown 13273 1726853300.41365: variable 'ansible_module_compression' from source: unknown 13273 1726853300.41368: variable 'ansible_shell_type' from source: unknown 13273 1726853300.41370: variable 'ansible_shell_executable' from source: unknown 13273 1726853300.41374: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.41376: variable 'ansible_pipelining' from source: unknown 13273 1726853300.41379: variable 'ansible_timeout' from source: unknown 13273 1726853300.41383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.41525: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853300.41533: variable 'omit' from source: magic vars 13273 1726853300.41539: starting attempt loop 13273 1726853300.41543: running the handler 13273 1726853300.41557: _low_level_execute_command(): starting 13273 1726853300.41563: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853300.42195: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.42199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.42202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.42204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.42217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853300.42267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.42336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.44053: stdout chunk (state=3): >>>/root <<< 13273 1726853300.44151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.44198: stderr chunk (state=3): >>><<< 13273 1726853300.44201: stdout chunk (state=3): >>><<< 13273 1726853300.44223: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853300.44316: _low_level_execute_command(): starting 13273 1726853300.44320: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538 `" && echo ansible-tmp-1726853300.4423125-14209-179731854454538="` echo /root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538 `" ) && sleep 0' 13273 1726853300.44822: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853300.44836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853300.44851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.44867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853300.44895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 <<< 13273 1726853300.44994: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853300.45018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.45099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.47072: stdout chunk (state=3): >>>ansible-tmp-1726853300.4423125-14209-179731854454538=/root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538 <<< 13273 1726853300.47207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.47220: stdout chunk (state=3): >>><<< 13273 1726853300.47282: stderr chunk (state=3): >>><<< 13273 1726853300.47307: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853300.4423125-14209-179731854454538=/root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853300.47361: variable 'ansible_module_compression' from source: unknown 13273 1726853300.47435: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13273 1726853300.47503: variable 'ansible_facts' from source: unknown 13273 1726853300.47613: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/AnsiballZ_stat.py 13273 1726853300.47892: Sending initial data 13273 1726853300.47895: Sent initial data (153 bytes) 13273 1726853300.48393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853300.48409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853300.48424: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.48473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853300.48492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.48548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.50179: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853300.50318: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853300.50368: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpx8h27rg0 /root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/AnsiballZ_stat.py <<< 13273 1726853300.50377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/AnsiballZ_stat.py" <<< 13273 1726853300.50477: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpx8h27rg0" to remote "/root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/AnsiballZ_stat.py" <<< 13273 1726853300.52010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.52013: stderr chunk (state=3): >>><<< 13273 1726853300.52016: stdout chunk (state=3): >>><<< 13273 1726853300.52030: done transferring module to remote 13273 1726853300.52050: _low_level_execute_command(): starting 13273 1726853300.52062: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/ /root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/AnsiballZ_stat.py && sleep 0' 13273 1726853300.52755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853300.52775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853300.52790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.52813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853300.52832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 
1726853300.52890: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.52949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853300.52969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853300.53000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.53099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.54979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.55007: stderr chunk (state=3): >>><<< 13273 1726853300.55010: stdout chunk (state=3): >>><<< 13273 1726853300.55041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853300.55047: _low_level_execute_command(): starting 13273 1726853300.55049: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/AnsiballZ_stat.py && sleep 0' 13273 1726853300.55476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.55480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853300.55482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.55484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.55486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.55542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853300.55545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.55604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.70965: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13273 1726853300.72313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853300.72348: stderr chunk (state=3): >>><<< 13273 1726853300.72352: stdout chunk (state=3): >>><<< 13273 1726853300.72364: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853300.72392: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853300.72399: _low_level_execute_command(): starting 13273 1726853300.72404: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853300.4423125-14209-179731854454538/ > /dev/null 2>&1 && sleep 0' 13273 1726853300.72864: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853300.72867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853300.72869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.72873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853300.72875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853300.72877: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.72927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853300.72934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.72995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.74858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.74884: stderr chunk (state=3): >>><<< 13273 1726853300.74887: stdout chunk (state=3): >>><<< 13273 1726853300.74899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853300.74905: handler run complete 13273 1726853300.74925: attempt loop complete, returning result 13273 1726853300.74928: _execute() done 13273 1726853300.74931: dumping result to json 13273 1726853300.74933: done dumping result, returning 13273 1726853300.74941: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-5fc3-657d-0000000005e5] 13273 1726853300.74947: sending task result for task 02083763-bbaf-5fc3-657d-0000000005e5 13273 1726853300.75039: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005e5 13273 1726853300.75041: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 13273 1726853300.75097: no more pending results, returning what we have 13273 1726853300.75100: results queue empty 13273 1726853300.75101: checking for any_errors_fatal 13273 1726853300.75107: done checking for any_errors_fatal 13273 1726853300.75108: checking for max_fail_percentage 13273 1726853300.75110: done checking for max_fail_percentage 13273 1726853300.75110: checking to see if all hosts have failed and the running result is not ok 13273 1726853300.75111: done checking to see if all hosts have failed 13273 1726853300.75111: getting the remaining hosts for this loop 
13273 1726853300.75113: done getting the remaining hosts for this loop 13273 1726853300.75116: getting the next task for host managed_node3 13273 1726853300.75122: done getting next task for host managed_node3 13273 1726853300.75124: ^ task is: TASK: Set NM profile exist flag based on the profile files 13273 1726853300.75129: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853300.75133: getting variables 13273 1726853300.75134: in VariableManager get_vars() 13273 1726853300.75188: Calling all_inventory to load vars for managed_node3 13273 1726853300.75191: Calling groups_inventory to load vars for managed_node3 13273 1726853300.75194: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.75204: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.75206: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.75209: Calling groups_plugins_play to load vars for managed_node3 13273 1726853300.75998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.76850: done with get_vars() 13273 1726853300.76865: done getting variables 13273 1726853300.76914: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:28:20 -0400 (0:00:00.369) 0:00:18.658 ****** 13273 1726853300.76940: entering _queue_task() for managed_node3/set_fact 13273 1726853300.77157: worker is 1 (out of 1 available) 13273 1726853300.77172: exiting _queue_task() for managed_node3/set_fact 13273 1726853300.77186: done queuing things up, now waiting for results queue to drain 13273 1726853300.77187: waiting for pending results... 
13273 1726853300.77356: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 13273 1726853300.77430: in run() - task 02083763-bbaf-5fc3-657d-0000000005e6 13273 1726853300.77442: variable 'ansible_search_path' from source: unknown 13273 1726853300.77446: variable 'ansible_search_path' from source: unknown 13273 1726853300.77475: calling self._execute() 13273 1726853300.77545: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.77553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.77562: variable 'omit' from source: magic vars 13273 1726853300.77838: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.77850: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.77935: variable 'profile_stat' from source: set_fact 13273 1726853300.77947: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853300.77951: when evaluation is False, skipping this task 13273 1726853300.77954: _execute() done 13273 1726853300.77956: dumping result to json 13273 1726853300.77959: done dumping result, returning 13273 1726853300.77974: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-5fc3-657d-0000000005e6] 13273 1726853300.77977: sending task result for task 02083763-bbaf-5fc3-657d-0000000005e6 13273 1726853300.78050: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005e6 13273 1726853300.78053: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853300.78114: no more pending results, returning what we have 13273 1726853300.78118: results queue empty 13273 1726853300.78119: checking for any_errors_fatal 13273 1726853300.78126: done checking for any_errors_fatal 13273 1726853300.78126: 
checking for max_fail_percentage 13273 1726853300.78128: done checking for max_fail_percentage 13273 1726853300.78128: checking to see if all hosts have failed and the running result is not ok 13273 1726853300.78129: done checking to see if all hosts have failed 13273 1726853300.78130: getting the remaining hosts for this loop 13273 1726853300.78131: done getting the remaining hosts for this loop 13273 1726853300.78134: getting the next task for host managed_node3 13273 1726853300.78140: done getting next task for host managed_node3 13273 1726853300.78141: ^ task is: TASK: Get NM profile info 13273 1726853300.78148: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853300.78151: getting variables 13273 1726853300.78152: in VariableManager get_vars() 13273 1726853300.78195: Calling all_inventory to load vars for managed_node3 13273 1726853300.78197: Calling groups_inventory to load vars for managed_node3 13273 1726853300.78199: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853300.78208: Calling all_plugins_play to load vars for managed_node3 13273 1726853300.78210: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853300.78212: Calling groups_plugins_play to load vars for managed_node3 13273 1726853300.79042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853300.79888: done with get_vars() 13273 1726853300.79902: done getting variables 13273 1726853300.79945: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:28:20 -0400 (0:00:00.030) 0:00:18.688 ****** 13273 1726853300.79965: entering _queue_task() for managed_node3/shell 13273 1726853300.80164: worker is 1 (out of 1 available) 13273 1726853300.80178: exiting _queue_task() for managed_node3/shell 13273 1726853300.80190: done queuing things up, now waiting for results queue to drain 13273 1726853300.80191: waiting for pending results... 
13273 1726853300.80362: running TaskExecutor() for managed_node3/TASK: Get NM profile info 13273 1726853300.80428: in run() - task 02083763-bbaf-5fc3-657d-0000000005e7 13273 1726853300.80439: variable 'ansible_search_path' from source: unknown 13273 1726853300.80443: variable 'ansible_search_path' from source: unknown 13273 1726853300.80474: calling self._execute() 13273 1726853300.80542: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.80550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.80559: variable 'omit' from source: magic vars 13273 1726853300.80829: variable 'ansible_distribution_major_version' from source: facts 13273 1726853300.80838: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853300.80843: variable 'omit' from source: magic vars 13273 1726853300.80882: variable 'omit' from source: magic vars 13273 1726853300.80949: variable 'profile' from source: include params 13273 1726853300.80953: variable 'item' from source: include params 13273 1726853300.81003: variable 'item' from source: include params 13273 1726853300.81018: variable 'omit' from source: magic vars 13273 1726853300.81051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853300.81081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853300.81095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853300.81110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853300.81120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853300.81143: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 
1726853300.81149: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.81152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.81221: Set connection var ansible_connection to ssh 13273 1726853300.81229: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853300.81234: Set connection var ansible_shell_executable to /bin/sh 13273 1726853300.81237: Set connection var ansible_shell_type to sh 13273 1726853300.81242: Set connection var ansible_pipelining to False 13273 1726853300.81249: Set connection var ansible_timeout to 10 13273 1726853300.81268: variable 'ansible_shell_executable' from source: unknown 13273 1726853300.81273: variable 'ansible_connection' from source: unknown 13273 1726853300.81276: variable 'ansible_module_compression' from source: unknown 13273 1726853300.81278: variable 'ansible_shell_type' from source: unknown 13273 1726853300.81288: variable 'ansible_shell_executable' from source: unknown 13273 1726853300.81290: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853300.81292: variable 'ansible_pipelining' from source: unknown 13273 1726853300.81295: variable 'ansible_timeout' from source: unknown 13273 1726853300.81297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853300.81398: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853300.81401: variable 'omit' from source: magic vars 13273 1726853300.81408: starting attempt loop 13273 1726853300.81411: running the handler 13273 1726853300.81420: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853300.81434: _low_level_execute_command(): starting 13273 1726853300.81441: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853300.81947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.81951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.81956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.81958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.82009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853300.82012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853300.82014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.82092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.83804: stdout chunk (state=3): >>>/root <<< 13273 1726853300.83898: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.83928: stderr chunk (state=3): >>><<< 13273 1726853300.83931: stdout chunk (state=3): >>><<< 13273 1726853300.83953: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853300.83964: _low_level_execute_command(): starting 13273 1726853300.83970: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085 `" && echo ansible-tmp-1726853300.839523-14232-149410026161085="` echo /root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085 `" ) && sleep 0' 13273 1726853300.84420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 13273 1726853300.84425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853300.84428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853300.84430: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853300.84432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.84482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853300.84485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.84556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.86493: stdout chunk (state=3): >>>ansible-tmp-1726853300.839523-14232-149410026161085=/root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085 <<< 13273 1726853300.86605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.86628: stderr chunk (state=3): >>><<< 13273 1726853300.86631: stdout chunk (state=3): >>><<< 13273 1726853300.86648: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853300.839523-14232-149410026161085=/root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853300.86674: variable 'ansible_module_compression' from source: unknown 13273 1726853300.86717: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853300.86749: variable 'ansible_facts' from source: unknown 13273 1726853300.86808: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/AnsiballZ_command.py 13273 1726853300.86903: Sending initial data 13273 1726853300.86907: Sent initial data (155 bytes) 13273 1726853300.87341: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 13273 1726853300.87346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853300.87349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.87351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.87353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.87405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853300.87408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.87476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.89098: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13273 1726853300.89102: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853300.89152: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853300.89214: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpcow_qe1o /root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/AnsiballZ_command.py <<< 13273 1726853300.89220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/AnsiballZ_command.py" <<< 13273 1726853300.89266: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpcow_qe1o" to remote "/root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/AnsiballZ_command.py" <<< 13273 1726853300.89274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/AnsiballZ_command.py" <<< 13273 1726853300.89874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.89910: stderr chunk (state=3): >>><<< 13273 1726853300.89913: stdout chunk (state=3): >>><<< 13273 1726853300.89956: done transferring module to remote 13273 1726853300.89966: _low_level_execute_command(): starting 13273 1726853300.89969: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/ /root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/AnsiballZ_command.py && sleep 0' 13273 1726853300.90397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.90404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853300.90406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853300.90408: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853300.90411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.90456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853300.90461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.90517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853300.92382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853300.92402: stderr chunk (state=3): >>><<< 13273 1726853300.92405: stdout chunk (state=3): >>><<< 13273 1726853300.92416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853300.92419: _low_level_execute_command(): starting 13273 1726853300.92423: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/AnsiballZ_command.py && sleep 0' 13273 1726853300.92831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853300.92834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853300.92836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.92838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853300.92840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853300.92891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853300.92899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853300.92961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853301.10760: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:28:21.084984", "end": "2024-09-20 13:28:21.106170", "delta": "0:00:00.021186", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853301.12731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853301.12735: stdout chunk (state=3): >>><<< 13273 1726853301.12738: stderr chunk (state=3): >>><<< 13273 1726853301.12740: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:28:21.084984", "end": "2024-09-20 13:28:21.106170", "delta": "0:00:00.021186", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853301.12746: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853301.12753: _low_level_execute_command(): starting 13273 1726853301.12755: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853300.839523-14232-149410026161085/ > /dev/null 2>&1 && sleep 0' 13273 1726853301.13454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853301.13477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853301.13579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853301.13610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853301.13709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853301.15780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853301.15784: stdout chunk (state=3): >>><<< 13273 1726853301.15786: stderr chunk (state=3): >>><<< 13273 1726853301.15788: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853301.15811: handler run complete 13273 1726853301.16089: Evaluated conditional (False): False 13273 1726853301.16092: attempt loop complete, returning result 13273 1726853301.16094: _execute() done 13273 1726853301.16096: dumping result to json 13273 1726853301.16098: done dumping result, returning 13273 1726853301.16101: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-5fc3-657d-0000000005e7] 13273 1726853301.16103: sending task result for task 02083763-bbaf-5fc3-657d-0000000005e7 ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.021186", "end": "2024-09-20 13:28:21.106170", "rc": 0, "start": "2024-09-20 13:28:21.084984" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 13273 1726853301.16282: no more pending results, returning what we have 13273 1726853301.16286: results queue empty 13273 1726853301.16287: checking for any_errors_fatal 13273 1726853301.16292: done checking for any_errors_fatal 13273 1726853301.16292: checking for max_fail_percentage 13273 1726853301.16320: done checking for max_fail_percentage 13273 1726853301.16322: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.16323: done checking to see if all hosts have failed 13273 1726853301.16323: getting the remaining hosts for this loop 13273 1726853301.16325: done getting the remaining hosts for this loop 13273 1726853301.16329: getting the next task for host managed_node3 13273 1726853301.16336: done getting next task for host managed_node3 13273 1726853301.16339: ^ task is: TASK: Set NM profile exist 
flag and ansible_managed flag true based on the nmcli output 13273 1726853301.16344: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853301.16348: getting variables 13273 1726853301.16350: in VariableManager get_vars() 13273 1726853301.16528: Calling all_inventory to load vars for managed_node3 13273 1726853301.16540: Calling groups_inventory to load vars for managed_node3 13273 1726853301.16544: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.16594: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.16597: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.16600: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.17136: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005e7 13273 1726853301.17139: WORKER PROCESS EXITING 13273 1726853301.18593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.21746: done with get_vars() 13273 1726853301.21768: done getting variables 13273 1726853301.21832: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:28:21 -0400 (0:00:00.418) 0:00:19.107 ****** 13273 1726853301.21866: entering _queue_task() for managed_node3/set_fact 13273 1726853301.22251: worker is 1 (out of 1 available) 13273 1726853301.22261: exiting _queue_task() for managed_node3/set_fact 13273 1726853301.22337: done queuing things up, now waiting for results queue to drain 13273 1726853301.22339: waiting for pending results... 13273 1726853301.22519: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13273 1726853301.22638: in run() - task 02083763-bbaf-5fc3-657d-0000000005e8 13273 1726853301.22661: variable 'ansible_search_path' from source: unknown 13273 1726853301.22674: variable 'ansible_search_path' from source: unknown 13273 1726853301.22773: calling self._execute() 13273 1726853301.22809: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.22821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.22835: variable 'omit' from source: magic vars 13273 1726853301.23225: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.23242: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.23389: variable 'nm_profile_exists' from source: set_fact 13273 1726853301.23409: Evaluated conditional (nm_profile_exists.rc == 0): True 13273 1726853301.23430: variable 'omit' from source: magic 
vars 13273 1726853301.23530: variable 'omit' from source: magic vars 13273 1726853301.23534: variable 'omit' from source: magic vars 13273 1726853301.23561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853301.23599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853301.23622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853301.23653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.23668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.23747: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853301.23752: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.23754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.24079: Set connection var ansible_connection to ssh 13273 1726853301.24083: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853301.24086: Set connection var ansible_shell_executable to /bin/sh 13273 1726853301.24088: Set connection var ansible_shell_type to sh 13273 1726853301.24090: Set connection var ansible_pipelining to False 13273 1726853301.24092: Set connection var ansible_timeout to 10 13273 1726853301.24188: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.24191: variable 'ansible_connection' from source: unknown 13273 1726853301.24193: variable 'ansible_module_compression' from source: unknown 13273 1726853301.24195: variable 'ansible_shell_type' from source: unknown 13273 1726853301.24197: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.24199: variable 'ansible_host' from source: 
host vars for 'managed_node3' 13273 1726853301.24201: variable 'ansible_pipelining' from source: unknown 13273 1726853301.24203: variable 'ansible_timeout' from source: unknown 13273 1726853301.24206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.24421: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853301.24437: variable 'omit' from source: magic vars 13273 1726853301.24446: starting attempt loop 13273 1726853301.24453: running the handler 13273 1726853301.24468: handler run complete 13273 1726853301.24524: attempt loop complete, returning result 13273 1726853301.24531: _execute() done 13273 1726853301.24538: dumping result to json 13273 1726853301.24545: done dumping result, returning 13273 1726853301.24559: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-5fc3-657d-0000000005e8] 13273 1726853301.24569: sending task result for task 02083763-bbaf-5fc3-657d-0000000005e8 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13273 1726853301.24778: no more pending results, returning what we have 13273 1726853301.24782: results queue empty 13273 1726853301.24783: checking for any_errors_fatal 13273 1726853301.24792: done checking for any_errors_fatal 13273 1726853301.24793: checking for max_fail_percentage 13273 1726853301.24794: done checking for max_fail_percentage 13273 1726853301.24795: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.24796: done checking to see if all hosts have failed 13273 
1726853301.24796: getting the remaining hosts for this loop 13273 1726853301.24798: done getting the remaining hosts for this loop 13273 1726853301.24801: getting the next task for host managed_node3 13273 1726853301.24812: done getting next task for host managed_node3 13273 1726853301.24815: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13273 1726853301.24820: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853301.24825: getting variables 13273 1726853301.24826: in VariableManager get_vars() 13273 1726853301.24886: Calling all_inventory to load vars for managed_node3 13273 1726853301.24889: Calling groups_inventory to load vars for managed_node3 13273 1726853301.24891: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.24902: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.24905: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.24908: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.25641: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005e8 13273 1726853301.25644: WORKER PROCESS EXITING 13273 1726853301.26650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.28305: done with get_vars() 13273 1726853301.28333: done getting variables 13273 1726853301.28402: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853301.28525: variable 'profile' from source: include params 13273 1726853301.28529: variable 'item' from source: include params 13273 1726853301.28595: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:28:21 -0400 (0:00:00.067) 0:00:19.175 ****** 13273 1726853301.28633: entering _queue_task() for managed_node3/command 13273 1726853301.28967: worker is 1 (out of 1 available) 13273 1726853301.29182: exiting _queue_task() for managed_node3/command 13273 
1726853301.29192: done queuing things up, now waiting for results queue to drain 13273 1726853301.29193: waiting for pending results... 13273 1726853301.29282: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 13273 1726853301.29403: in run() - task 02083763-bbaf-5fc3-657d-0000000005ea 13273 1726853301.29429: variable 'ansible_search_path' from source: unknown 13273 1726853301.29437: variable 'ansible_search_path' from source: unknown 13273 1726853301.29482: calling self._execute() 13273 1726853301.29580: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.29593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.29607: variable 'omit' from source: magic vars 13273 1726853301.29982: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.29998: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.30119: variable 'profile_stat' from source: set_fact 13273 1726853301.30138: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853301.30150: when evaluation is False, skipping this task 13273 1726853301.30157: _execute() done 13273 1726853301.30163: dumping result to json 13273 1726853301.30169: done dumping result, returning 13273 1726853301.30478: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [02083763-bbaf-5fc3-657d-0000000005ea] 13273 1726853301.30482: sending task result for task 02083763-bbaf-5fc3-657d-0000000005ea 13273 1726853301.30544: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005ea 13273 1726853301.30548: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853301.30635: no more pending results, returning what we have 13273 1726853301.30638: results queue empty 13273 
1726853301.30639: checking for any_errors_fatal 13273 1726853301.30646: done checking for any_errors_fatal 13273 1726853301.30647: checking for max_fail_percentage 13273 1726853301.30649: done checking for max_fail_percentage 13273 1726853301.30650: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.30650: done checking to see if all hosts have failed 13273 1726853301.30651: getting the remaining hosts for this loop 13273 1726853301.30653: done getting the remaining hosts for this loop 13273 1726853301.30656: getting the next task for host managed_node3 13273 1726853301.30661: done getting next task for host managed_node3 13273 1726853301.30664: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13273 1726853301.30668: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853301.30674: getting variables 13273 1726853301.30675: in VariableManager get_vars() 13273 1726853301.30720: Calling all_inventory to load vars for managed_node3 13273 1726853301.30723: Calling groups_inventory to load vars for managed_node3 13273 1726853301.30725: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.30734: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.30737: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.30740: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.33755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.35314: done with get_vars() 13273 1726853301.35336: done getting variables 13273 1726853301.35400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853301.35514: variable 'profile' from source: include params 13273 1726853301.35518: variable 'item' from source: include params 13273 1726853301.35579: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:28:21 -0400 (0:00:00.069) 0:00:19.245 ****** 13273 1726853301.35610: entering _queue_task() for managed_node3/set_fact 13273 1726853301.35950: worker is 1 (out of 1 available) 13273 1726853301.35961: exiting _queue_task() for managed_node3/set_fact 13273 1726853301.36176: done queuing things up, now waiting for results queue to drain 13273 1726853301.36177: waiting for pending results... 
13273 1726853301.36257: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 13273 1726853301.36379: in run() - task 02083763-bbaf-5fc3-657d-0000000005eb 13273 1726853301.36404: variable 'ansible_search_path' from source: unknown 13273 1726853301.36411: variable 'ansible_search_path' from source: unknown 13273 1726853301.36455: calling self._execute() 13273 1726853301.36556: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.36567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.36584: variable 'omit' from source: magic vars 13273 1726853301.36965: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.36983: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.37105: variable 'profile_stat' from source: set_fact 13273 1726853301.37125: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853301.37132: when evaluation is False, skipping this task 13273 1726853301.37138: _execute() done 13273 1726853301.37148: dumping result to json 13273 1726853301.37160: done dumping result, returning 13273 1726853301.37270: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [02083763-bbaf-5fc3-657d-0000000005eb] 13273 1726853301.37275: sending task result for task 02083763-bbaf-5fc3-657d-0000000005eb 13273 1726853301.37341: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005eb 13273 1726853301.37346: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853301.37422: no more pending results, returning what we have 13273 1726853301.37426: results queue empty 13273 1726853301.37427: checking for any_errors_fatal 13273 1726853301.37432: done checking for any_errors_fatal 13273 1726853301.37433: 
checking for max_fail_percentage 13273 1726853301.37434: done checking for max_fail_percentage 13273 1726853301.37435: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.37436: done checking to see if all hosts have failed 13273 1726853301.37436: getting the remaining hosts for this loop 13273 1726853301.37438: done getting the remaining hosts for this loop 13273 1726853301.37441: getting the next task for host managed_node3 13273 1726853301.37451: done getting next task for host managed_node3 13273 1726853301.37454: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13273 1726853301.37460: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853301.37464: getting variables 13273 1726853301.37466: in VariableManager get_vars() 13273 1726853301.37521: Calling all_inventory to load vars for managed_node3 13273 1726853301.37524: Calling groups_inventory to load vars for managed_node3 13273 1726853301.37527: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.37539: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.37545: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.37548: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.39064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.40955: done with get_vars() 13273 1726853301.41181: done getting variables 13273 1726853301.41240: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853301.41353: variable 'profile' from source: include params 13273 1726853301.41357: variable 'item' from source: include params 13273 1726853301.41624: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:28:21 -0400 (0:00:00.060) 0:00:19.305 ****** 13273 1726853301.41657: entering _queue_task() for managed_node3/command 13273 1726853301.42427: worker is 1 (out of 1 available) 13273 1726853301.42439: exiting _queue_task() for managed_node3/command 13273 1726853301.42454: done queuing things up, now waiting for results queue to drain 13273 1726853301.42455: waiting for pending results... 
13273 1726853301.42818: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 13273 1726853301.43023: in run() - task 02083763-bbaf-5fc3-657d-0000000005ec 13273 1726853301.43028: variable 'ansible_search_path' from source: unknown 13273 1726853301.43031: variable 'ansible_search_path' from source: unknown 13273 1726853301.43238: calling self._execute() 13273 1726853301.43375: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.43378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.43381: variable 'omit' from source: magic vars 13273 1726853301.44338: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.44342: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.44445: variable 'profile_stat' from source: set_fact 13273 1726853301.44775: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853301.44779: when evaluation is False, skipping this task 13273 1726853301.44783: _execute() done 13273 1726853301.44786: dumping result to json 13273 1726853301.44789: done dumping result, returning 13273 1726853301.44796: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [02083763-bbaf-5fc3-657d-0000000005ec] 13273 1726853301.44799: sending task result for task 02083763-bbaf-5fc3-657d-0000000005ec 13273 1726853301.44861: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005ec 13273 1726853301.44864: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853301.44951: no more pending results, returning what we have 13273 1726853301.44955: results queue empty 13273 1726853301.44956: checking for any_errors_fatal 13273 1726853301.44962: done checking for any_errors_fatal 13273 1726853301.44963: checking for 
max_fail_percentage 13273 1726853301.44965: done checking for max_fail_percentage 13273 1726853301.44965: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.44966: done checking to see if all hosts have failed 13273 1726853301.44967: getting the remaining hosts for this loop 13273 1726853301.44968: done getting the remaining hosts for this loop 13273 1726853301.44973: getting the next task for host managed_node3 13273 1726853301.44980: done getting next task for host managed_node3 13273 1726853301.44982: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13273 1726853301.44987: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853301.44991: getting variables 13273 1726853301.44993: in VariableManager get_vars() 13273 1726853301.45049: Calling all_inventory to load vars for managed_node3 13273 1726853301.45053: Calling groups_inventory to load vars for managed_node3 13273 1726853301.45055: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.45068: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.45274: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.45279: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.48110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.50480: done with get_vars() 13273 1726853301.50506: done getting variables 13273 1726853301.50579: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853301.50696: variable 'profile' from source: include params 13273 1726853301.50700: variable 'item' from source: include params 13273 1726853301.50778: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:28:21 -0400 (0:00:00.091) 0:00:19.397 ****** 13273 1726853301.50807: entering _queue_task() for managed_node3/set_fact 13273 1726853301.51170: worker is 1 (out of 1 available) 13273 1726853301.51385: exiting _queue_task() for managed_node3/set_fact 13273 1726853301.51397: done queuing things up, now waiting for results queue to drain 13273 1726853301.51397: waiting for pending results... 
13273 1726853301.51503: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 13273 1726853301.51628: in run() - task 02083763-bbaf-5fc3-657d-0000000005ed 13273 1726853301.51650: variable 'ansible_search_path' from source: unknown 13273 1726853301.51658: variable 'ansible_search_path' from source: unknown 13273 1726853301.51730: calling self._execute() 13273 1726853301.51854: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.51865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.51883: variable 'omit' from source: magic vars 13273 1726853301.52378: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.52384: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.52787: variable 'profile_stat' from source: set_fact 13273 1726853301.52790: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853301.52792: when evaluation is False, skipping this task 13273 1726853301.52794: _execute() done 13273 1726853301.52796: dumping result to json 13273 1726853301.52798: done dumping result, returning 13273 1726853301.52801: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [02083763-bbaf-5fc3-657d-0000000005ed] 13273 1726853301.52803: sending task result for task 02083763-bbaf-5fc3-657d-0000000005ed 13273 1726853301.52867: done sending task result for task 02083763-bbaf-5fc3-657d-0000000005ed 13273 1726853301.52872: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853301.52921: no more pending results, returning what we have 13273 1726853301.52926: results queue empty 13273 1726853301.52927: checking for any_errors_fatal 13273 1726853301.52932: done checking for any_errors_fatal 13273 1726853301.52933: checking for 
max_fail_percentage 13273 1726853301.52935: done checking for max_fail_percentage 13273 1726853301.52936: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.52936: done checking to see if all hosts have failed 13273 1726853301.52937: getting the remaining hosts for this loop 13273 1726853301.52939: done getting the remaining hosts for this loop 13273 1726853301.52945: getting the next task for host managed_node3 13273 1726853301.52953: done getting next task for host managed_node3 13273 1726853301.52956: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13273 1726853301.52960: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853301.52965: getting variables 13273 1726853301.52967: in VariableManager get_vars() 13273 1726853301.53026: Calling all_inventory to load vars for managed_node3 13273 1726853301.53029: Calling groups_inventory to load vars for managed_node3 13273 1726853301.53032: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.53048: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.53051: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.53055: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.56600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.60711: done with get_vars() 13273 1726853301.60745: done getting variables 13273 1726853301.60810: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853301.61138: variable 'profile' from source: include params 13273 1726853301.61144: variable 'item' from source: include params 13273 1726853301.61209: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:28:21 -0400 (0:00:00.104) 0:00:19.501 ****** 13273 1726853301.61239: entering _queue_task() for managed_node3/assert 13273 1726853301.62027: worker is 1 (out of 1 available) 13273 1726853301.62040: exiting _queue_task() for managed_node3/assert 13273 1726853301.62055: done queuing things up, now waiting for results queue to drain 13273 1726853301.62056: waiting for pending results... 
13273 1726853301.63096: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 13273 1726853301.63580: in run() - task 02083763-bbaf-5fc3-657d-000000000356 13273 1726853301.63585: variable 'ansible_search_path' from source: unknown 13273 1726853301.63588: variable 'ansible_search_path' from source: unknown 13273 1726853301.63664: calling self._execute() 13273 1726853301.64177: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.64181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.64185: variable 'omit' from source: magic vars 13273 1726853301.64888: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.64908: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.64920: variable 'omit' from source: magic vars 13273 1726853301.64963: variable 'omit' from source: magic vars 13273 1726853301.65163: variable 'profile' from source: include params 13273 1726853301.65219: variable 'item' from source: include params 13273 1726853301.65287: variable 'item' from source: include params 13273 1726853301.65539: variable 'omit' from source: magic vars 13273 1726853301.65542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853301.65545: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853301.65591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853301.65668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.65689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.65726: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13273 1726853301.65865: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.65869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.66092: Set connection var ansible_connection to ssh 13273 1726853301.66109: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853301.66121: Set connection var ansible_shell_executable to /bin/sh 13273 1726853301.66129: Set connection var ansible_shell_type to sh 13273 1726853301.66141: Set connection var ansible_pipelining to False 13273 1726853301.66152: Set connection var ansible_timeout to 10 13273 1726853301.66192: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.66253: variable 'ansible_connection' from source: unknown 13273 1726853301.66261: variable 'ansible_module_compression' from source: unknown 13273 1726853301.66267: variable 'ansible_shell_type' from source: unknown 13273 1726853301.66277: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.66285: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.66293: variable 'ansible_pipelining' from source: unknown 13273 1726853301.66307: variable 'ansible_timeout' from source: unknown 13273 1726853301.66316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.66461: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853301.66481: variable 'omit' from source: magic vars 13273 1726853301.66491: starting attempt loop 13273 1726853301.66497: running the handler 13273 1726853301.66616: variable 'lsr_net_profile_exists' from source: set_fact 13273 1726853301.66736: Evaluated conditional 
(lsr_net_profile_exists): True 13273 1726853301.66740: handler run complete 13273 1726853301.66743: attempt loop complete, returning result 13273 1726853301.66745: _execute() done 13273 1726853301.66748: dumping result to json 13273 1726853301.66751: done dumping result, returning 13273 1726853301.66754: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [02083763-bbaf-5fc3-657d-000000000356] 13273 1726853301.66757: sending task result for task 02083763-bbaf-5fc3-657d-000000000356 13273 1726853301.66826: done sending task result for task 02083763-bbaf-5fc3-657d-000000000356 13273 1726853301.66830: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853301.66883: no more pending results, returning what we have 13273 1726853301.66886: results queue empty 13273 1726853301.66887: checking for any_errors_fatal 13273 1726853301.66893: done checking for any_errors_fatal 13273 1726853301.66893: checking for max_fail_percentage 13273 1726853301.66895: done checking for max_fail_percentage 13273 1726853301.66896: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.66896: done checking to see if all hosts have failed 13273 1726853301.66897: getting the remaining hosts for this loop 13273 1726853301.66899: done getting the remaining hosts for this loop 13273 1726853301.66901: getting the next task for host managed_node3 13273 1726853301.66907: done getting next task for host managed_node3 13273 1726853301.66909: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13273 1726853301.66913: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853301.66918: getting variables 13273 1726853301.66919: in VariableManager get_vars() 13273 1726853301.66978: Calling all_inventory to load vars for managed_node3 13273 1726853301.66981: Calling groups_inventory to load vars for managed_node3 13273 1726853301.66983: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.66993: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.66996: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.66999: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.69659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.72813: done with get_vars() 13273 1726853301.72845: done getting variables 13273 1726853301.72908: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853301.73229: variable 'profile' from source: include params 13273 1726853301.73233: variable 'item' from source: include params 13273 1726853301.73295: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:28:21 -0400 
(0:00:00.120) 0:00:19.622 ****** 13273 1726853301.73332: entering _queue_task() for managed_node3/assert 13273 1726853301.74091: worker is 1 (out of 1 available) 13273 1726853301.74104: exiting _queue_task() for managed_node3/assert 13273 1726853301.74116: done queuing things up, now waiting for results queue to drain 13273 1726853301.74117: waiting for pending results... 13273 1726853301.74788: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 13273 1726853301.74843: in run() - task 02083763-bbaf-5fc3-657d-000000000357 13273 1726853301.74935: variable 'ansible_search_path' from source: unknown 13273 1726853301.74943: variable 'ansible_search_path' from source: unknown 13273 1726853301.75180: calling self._execute() 13273 1726853301.75253: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.75265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.75281: variable 'omit' from source: magic vars 13273 1726853301.76036: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.76078: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.76123: variable 'omit' from source: magic vars 13273 1726853301.76237: variable 'omit' from source: magic vars 13273 1726853301.76476: variable 'profile' from source: include params 13273 1726853301.76486: variable 'item' from source: include params 13273 1726853301.76670: variable 'item' from source: include params 13273 1726853301.76697: variable 'omit' from source: magic vars 13273 1726853301.76891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853301.76917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853301.76961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 
1726853301.77024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.77086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.77145: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853301.77183: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.77191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.77425: Set connection var ansible_connection to ssh 13273 1726853301.77667: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853301.77670: Set connection var ansible_shell_executable to /bin/sh 13273 1726853301.77675: Set connection var ansible_shell_type to sh 13273 1726853301.77678: Set connection var ansible_pipelining to False 13273 1726853301.77680: Set connection var ansible_timeout to 10 13273 1726853301.77682: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.77684: variable 'ansible_connection' from source: unknown 13273 1726853301.77686: variable 'ansible_module_compression' from source: unknown 13273 1726853301.77688: variable 'ansible_shell_type' from source: unknown 13273 1726853301.77690: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.77692: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.77694: variable 'ansible_pipelining' from source: unknown 13273 1726853301.77697: variable 'ansible_timeout' from source: unknown 13273 1726853301.77699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.77932: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853301.77988: variable 'omit' from source: magic vars 13273 1726853301.78376: starting attempt loop 13273 1726853301.78384: running the handler 13273 1726853301.78420: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13273 1726853301.78430: Evaluated conditional (lsr_net_profile_ansible_managed): True 13273 1726853301.78440: handler run complete 13273 1726853301.78460: attempt loop complete, returning result 13273 1726853301.78467: _execute() done 13273 1726853301.78476: dumping result to json 13273 1726853301.78483: done dumping result, returning 13273 1726853301.78495: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [02083763-bbaf-5fc3-657d-000000000357] 13273 1726853301.78503: sending task result for task 02083763-bbaf-5fc3-657d-000000000357 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853301.78649: no more pending results, returning what we have 13273 1726853301.78652: results queue empty 13273 1726853301.78653: checking for any_errors_fatal 13273 1726853301.78659: done checking for any_errors_fatal 13273 1726853301.78660: checking for max_fail_percentage 13273 1726853301.78662: done checking for max_fail_percentage 13273 1726853301.78662: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.78663: done checking to see if all hosts have failed 13273 1726853301.78664: getting the remaining hosts for this loop 13273 1726853301.78665: done getting the remaining hosts for this loop 13273 1726853301.78668: getting the next task for host managed_node3 13273 1726853301.78676: done getting next task for host managed_node3 13273 1726853301.78679: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13273 
1726853301.78683: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853301.78688: getting variables 13273 1726853301.78690: in VariableManager get_vars() 13273 1726853301.78749: Calling all_inventory to load vars for managed_node3 13273 1726853301.78753: Calling groups_inventory to load vars for managed_node3 13273 1726853301.78755: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.78763: done sending task result for task 02083763-bbaf-5fc3-657d-000000000357 13273 1726853301.78766: WORKER PROCESS EXITING 13273 1726853301.78777: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.78780: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.78783: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.80312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.82048: done with get_vars() 13273 1726853301.82074: done getting variables 13273 1726853301.82379: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853301.82495: variable 'profile' from source: include params 13273 1726853301.82499: variable 'item' from 
source: include params 13273 1726853301.82562: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:28:21 -0400 (0:00:00.092) 0:00:19.715 ****** 13273 1726853301.82616: entering _queue_task() for managed_node3/assert 13273 1726853301.82966: worker is 1 (out of 1 available) 13273 1726853301.83084: exiting _queue_task() for managed_node3/assert 13273 1726853301.83096: done queuing things up, now waiting for results queue to drain 13273 1726853301.83097: waiting for pending results... 13273 1726853301.83284: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 13273 1726853301.83422: in run() - task 02083763-bbaf-5fc3-657d-000000000358 13273 1726853301.83449: variable 'ansible_search_path' from source: unknown 13273 1726853301.83456: variable 'ansible_search_path' from source: unknown 13273 1726853301.83498: calling self._execute() 13273 1726853301.83623: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.83664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.83685: variable 'omit' from source: magic vars 13273 1726853301.84276: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.84280: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.84283: variable 'omit' from source: magic vars 13273 1726853301.84284: variable 'omit' from source: magic vars 13273 1726853301.84286: variable 'profile' from source: include params 13273 1726853301.84288: variable 'item' from source: include params 13273 1726853301.84306: variable 'item' from source: include params 13273 1726853301.84326: variable 'omit' from source: magic vars 13273 1726853301.84373: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853301.84424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853301.84454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853301.84482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.84499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.84545: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853301.84555: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.84563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.84680: Set connection var ansible_connection to ssh 13273 1726853301.84697: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853301.84707: Set connection var ansible_shell_executable to /bin/sh 13273 1726853301.84714: Set connection var ansible_shell_type to sh 13273 1726853301.84729: Set connection var ansible_pipelining to False 13273 1726853301.84740: Set connection var ansible_timeout to 10 13273 1726853301.84776: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.84785: variable 'ansible_connection' from source: unknown 13273 1726853301.84792: variable 'ansible_module_compression' from source: unknown 13273 1726853301.84799: variable 'ansible_shell_type' from source: unknown 13273 1726853301.84806: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.84812: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.84820: variable 'ansible_pipelining' from source: unknown 13273 1726853301.84827: variable 'ansible_timeout' from 
source: unknown 13273 1726853301.84840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.85002: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853301.85020: variable 'omit' from source: magic vars 13273 1726853301.85031: starting attempt loop 13273 1726853301.85038: running the handler 13273 1726853301.85164: variable 'lsr_net_profile_fingerprint' from source: set_fact 13273 1726853301.85270: Evaluated conditional (lsr_net_profile_fingerprint): True 13273 1726853301.85274: handler run complete 13273 1726853301.85277: attempt loop complete, returning result 13273 1726853301.85279: _execute() done 13273 1726853301.85281: dumping result to json 13273 1726853301.85284: done dumping result, returning 13273 1726853301.85286: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [02083763-bbaf-5fc3-657d-000000000358] 13273 1726853301.85288: sending task result for task 02083763-bbaf-5fc3-657d-000000000358 13273 1726853301.85359: done sending task result for task 02083763-bbaf-5fc3-657d-000000000358 13273 1726853301.85363: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853301.85427: no more pending results, returning what we have 13273 1726853301.85431: results queue empty 13273 1726853301.85433: checking for any_errors_fatal 13273 1726853301.85439: done checking for any_errors_fatal 13273 1726853301.85440: checking for max_fail_percentage 13273 1726853301.85444: done checking for max_fail_percentage 13273 1726853301.85446: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.85446: done checking to see if all hosts have 
failed 13273 1726853301.85447: getting the remaining hosts for this loop 13273 1726853301.85449: done getting the remaining hosts for this loop 13273 1726853301.85452: getting the next task for host managed_node3 13273 1726853301.85461: done getting next task for host managed_node3 13273 1726853301.85465: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13273 1726853301.85469: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853301.85476: getting variables 13273 1726853301.85477: in VariableManager get_vars() 13273 1726853301.85540: Calling all_inventory to load vars for managed_node3 13273 1726853301.85546: Calling groups_inventory to load vars for managed_node3 13273 1726853301.85550: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.85562: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.85566: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.85797: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.87452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.88978: done with get_vars() 13273 1726853301.89002: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 13:28:21 -0400 (0:00:00.064) 0:00:19.780 ****** 13273 1726853301.89103: entering _queue_task() for managed_node3/include_tasks 13273 1726853301.89439: worker is 1 (out of 1 available) 13273 1726853301.89455: exiting _queue_task() for managed_node3/include_tasks 13273 1726853301.89468: done queuing things up, now waiting for results queue to drain 13273 1726853301.89469: waiting for pending results... 13273 1726853301.89790: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 13273 1726853301.89877: in run() - task 02083763-bbaf-5fc3-657d-00000000035c 13273 1726853301.89881: variable 'ansible_search_path' from source: unknown 13273 1726853301.89884: variable 'ansible_search_path' from source: unknown 13273 1726853301.89892: calling self._execute() 13273 1726853301.89995: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.90007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.90028: variable 'omit' from source: magic vars 13273 1726853301.90405: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.90423: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.90450: _execute() done 13273 1726853301.90454: dumping result to json 13273 1726853301.90456: done dumping result, returning 13273 1726853301.90463: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-5fc3-657d-00000000035c] 13273 1726853301.90560: sending task result for task 02083763-bbaf-5fc3-657d-00000000035c 13273 1726853301.90630: done sending task result for task 02083763-bbaf-5fc3-657d-00000000035c 13273 1726853301.90634: WORKER PROCESS EXITING 13273 1726853301.90692: no more pending results, returning what we have 13273 1726853301.90697: in VariableManager get_vars() 13273 1726853301.90765: Calling all_inventory to load vars for managed_node3 13273 
1726853301.90768: Calling groups_inventory to load vars for managed_node3 13273 1726853301.90773: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.90788: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.90792: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.90795: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.92418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.93882: done with get_vars() 13273 1726853301.93898: variable 'ansible_search_path' from source: unknown 13273 1726853301.93900: variable 'ansible_search_path' from source: unknown 13273 1726853301.93934: we have included files to process 13273 1726853301.93935: generating all_blocks data 13273 1726853301.93937: done generating all_blocks data 13273 1726853301.93941: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853301.93945: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853301.93947: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853301.94839: done processing included file 13273 1726853301.94841: iterating over new_blocks loaded from include file 13273 1726853301.94845: in VariableManager get_vars() 13273 1726853301.94875: done with get_vars() 13273 1726853301.94876: filtering new block on tags 13273 1726853301.94899: done filtering new block on tags 13273 1726853301.94902: in VariableManager get_vars() 13273 1726853301.94930: done with get_vars() 13273 1726853301.94931: filtering new block on tags 13273 1726853301.94955: done filtering new block on tags 13273 1726853301.94957: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 13273 1726853301.94962: extending task lists for all hosts with included blocks 13273 1726853301.95130: done extending task lists 13273 1726853301.95132: done processing included files 13273 1726853301.95132: results queue empty 13273 1726853301.95133: checking for any_errors_fatal 13273 1726853301.95136: done checking for any_errors_fatal 13273 1726853301.95137: checking for max_fail_percentage 13273 1726853301.95138: done checking for max_fail_percentage 13273 1726853301.95139: checking to see if all hosts have failed and the running result is not ok 13273 1726853301.95140: done checking to see if all hosts have failed 13273 1726853301.95140: getting the remaining hosts for this loop 13273 1726853301.95141: done getting the remaining hosts for this loop 13273 1726853301.95146: getting the next task for host managed_node3 13273 1726853301.95150: done getting next task for host managed_node3 13273 1726853301.95152: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13273 1726853301.95156: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13273 1726853301.95158: getting variables 13273 1726853301.95159: in VariableManager get_vars() 13273 1726853301.95178: Calling all_inventory to load vars for managed_node3 13273 1726853301.95180: Calling groups_inventory to load vars for managed_node3 13273 1726853301.95182: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853301.95187: Calling all_plugins_play to load vars for managed_node3 13273 1726853301.95189: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853301.95192: Calling groups_plugins_play to load vars for managed_node3 13273 1726853301.96347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853301.97840: done with get_vars() 13273 1726853301.97864: done getting variables 13273 1726853301.97907: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:28:21 -0400 (0:00:00.088) 0:00:19.868 ****** 13273 1726853301.97938: entering _queue_task() for managed_node3/set_fact 13273 1726853301.98293: worker is 1 (out of 1 available) 13273 1726853301.98308: exiting _queue_task() for managed_node3/set_fact 13273 1726853301.98322: done queuing things up, now waiting for results queue to drain 13273 1726853301.98323: waiting for pending results... 
13273 1726853301.98691: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13273 1726853301.98735: in run() - task 02083763-bbaf-5fc3-657d-00000000062c 13273 1726853301.98761: variable 'ansible_search_path' from source: unknown 13273 1726853301.98770: variable 'ansible_search_path' from source: unknown 13273 1726853301.98817: calling self._execute() 13273 1726853301.98923: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.98936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853301.98953: variable 'omit' from source: magic vars 13273 1726853301.99337: variable 'ansible_distribution_major_version' from source: facts 13273 1726853301.99576: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853301.99580: variable 'omit' from source: magic vars 13273 1726853301.99582: variable 'omit' from source: magic vars 13273 1726853301.99584: variable 'omit' from source: magic vars 13273 1726853301.99586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853301.99589: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853301.99591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853301.99593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.99600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853301.99635: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853301.99646: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.99655: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13273 1726853301.99763: Set connection var ansible_connection to ssh 13273 1726853301.99782: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853301.99794: Set connection var ansible_shell_executable to /bin/sh 13273 1726853301.99802: Set connection var ansible_shell_type to sh 13273 1726853301.99816: Set connection var ansible_pipelining to False 13273 1726853301.99827: Set connection var ansible_timeout to 10 13273 1726853301.99856: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.99863: variable 'ansible_connection' from source: unknown 13273 1726853301.99869: variable 'ansible_module_compression' from source: unknown 13273 1726853301.99877: variable 'ansible_shell_type' from source: unknown 13273 1726853301.99882: variable 'ansible_shell_executable' from source: unknown 13273 1726853301.99887: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853301.99893: variable 'ansible_pipelining' from source: unknown 13273 1726853301.99898: variable 'ansible_timeout' from source: unknown 13273 1726853301.99903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853302.00038: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853302.00056: variable 'omit' from source: magic vars 13273 1726853302.00064: starting attempt loop 13273 1726853302.00069: running the handler 13273 1726853302.00085: handler run complete 13273 1726853302.00098: attempt loop complete, returning result 13273 1726853302.00103: _execute() done 13273 1726853302.00108: dumping result to json 13273 1726853302.00114: done dumping result, returning 13273 1726853302.00124: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-5fc3-657d-00000000062c] 13273 1726853302.00133: sending task result for task 02083763-bbaf-5fc3-657d-00000000062c
ok: [managed_node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
13273 1726853302.00300: no more pending results, returning what we have 13273 1726853302.00304: results queue empty 13273 1726853302.00305: checking for any_errors_fatal 13273 1726853302.00307: done checking for any_errors_fatal 13273 1726853302.00308: checking for max_fail_percentage 13273 1726853302.00309: done checking for max_fail_percentage 13273 1726853302.00310: checking to see if all hosts have failed and the running result is not ok 13273 1726853302.00310: done checking to see if all hosts have failed 13273 1726853302.00311: getting the remaining hosts for this loop 13273 1726853302.00313: done getting the remaining hosts for this loop 13273 1726853302.00316: getting the next task for host managed_node3 13273 1726853302.00322: done getting next task for host managed_node3 13273 1726853302.00324: ^ task is: TASK: Stat profile file 13273 1726853302.00330: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853302.00335: getting variables 13273 1726853302.00336: in VariableManager get_vars() 13273 1726853302.00393: Calling all_inventory to load vars for managed_node3 13273 1726853302.00396: Calling groups_inventory to load vars for managed_node3 13273 1726853302.00400: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853302.00412: Calling all_plugins_play to load vars for managed_node3 13273 1726853302.00415: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853302.00417: Calling groups_plugins_play to load vars for managed_node3 13273 1726853302.01131: done sending task result for task 02083763-bbaf-5fc3-657d-00000000062c 13273 1726853302.01135: WORKER PROCESS EXITING 13273 1726853302.06210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853302.07718: done with get_vars() 13273 1726853302.07742: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:28:22 -0400 (0:00:00.098) 0:00:19.967 ****** 13273 1726853302.07829: entering _queue_task() for managed_node3/stat 13273 1726853302.08185: worker is 1 (out of 1 available) 13273 1726853302.08197: exiting _queue_task() for managed_node3/stat 13273 1726853302.08210: done queuing things up, now waiting for results queue to drain 13273 1726853302.08211: waiting for pending results... 
13273 1726853302.08491: running TaskExecutor() for managed_node3/TASK: Stat profile file 13273 1726853302.08609: in run() - task 02083763-bbaf-5fc3-657d-00000000062d 13273 1726853302.08628: variable 'ansible_search_path' from source: unknown 13273 1726853302.08634: variable 'ansible_search_path' from source: unknown 13273 1726853302.08675: calling self._execute() 13273 1726853302.08775: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853302.08788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853302.08805: variable 'omit' from source: magic vars 13273 1726853302.09192: variable 'ansible_distribution_major_version' from source: facts 13273 1726853302.09212: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853302.09225: variable 'omit' from source: magic vars 13273 1726853302.09281: variable 'omit' from source: magic vars 13273 1726853302.09388: variable 'profile' from source: include params 13273 1726853302.09398: variable 'item' from source: include params 13273 1726853302.09468: variable 'item' from source: include params 13273 1726853302.09576: variable 'omit' from source: magic vars 13273 1726853302.09581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853302.09584: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853302.09608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853302.09632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853302.09652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853302.09688: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 
1726853302.09702: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853302.09776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853302.09820: Set connection var ansible_connection to ssh 13273 1726853302.09837: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853302.09853: Set connection var ansible_shell_executable to /bin/sh 13273 1726853302.09861: Set connection var ansible_shell_type to sh 13273 1726853302.09874: Set connection var ansible_pipelining to False 13273 1726853302.09884: Set connection var ansible_timeout to 10 13273 1726853302.09921: variable 'ansible_shell_executable' from source: unknown 13273 1726853302.09930: variable 'ansible_connection' from source: unknown 13273 1726853302.09939: variable 'ansible_module_compression' from source: unknown 13273 1726853302.09949: variable 'ansible_shell_type' from source: unknown 13273 1726853302.09957: variable 'ansible_shell_executable' from source: unknown 13273 1726853302.09965: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853302.10023: variable 'ansible_pipelining' from source: unknown 13273 1726853302.10026: variable 'ansible_timeout' from source: unknown 13273 1726853302.10029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853302.10196: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853302.10212: variable 'omit' from source: magic vars 13273 1726853302.10224: starting attempt loop 13273 1726853302.10232: running the handler 13273 1726853302.10259: _low_level_execute_command(): starting 13273 1726853302.10274: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853302.11092: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853302.11099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853302.11134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853302.11162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853302.11182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853302.11274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853302.12969: stdout chunk (state=3): >>>/root <<< 13273 1726853302.13130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853302.13133: stdout chunk (state=3): >>><<< 13273 1726853302.13136: stderr chunk (state=3): >>><<< 13273 1726853302.13156: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853302.13177: _low_level_execute_command(): starting 13273 1726853302.13188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081 `" && echo ansible-tmp-1726853302.131627-14273-175335695615081="` echo /root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081 `" ) && sleep 0' 13273 1726853302.13801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853302.13823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853302.13840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853302.13859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853302.13898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853302.13940: stderr 
chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853302.13956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853302.14037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853302.14056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853302.14078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853302.14168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853302.16187: stdout chunk (state=3): >>>ansible-tmp-1726853302.131627-14273-175335695615081=/root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081 <<< 13273 1726853302.16396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853302.16400: stdout chunk (state=3): >>><<< 13273 1726853302.16403: stderr chunk (state=3): >>><<< 13273 1726853302.16424: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853302.131627-14273-175335695615081=/root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853302.16577: variable 'ansible_module_compression' from source: unknown 13273 1726853302.16580: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13273 1726853302.16612: variable 'ansible_facts' from source: unknown 13273 1726853302.16709: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/AnsiballZ_stat.py 13273 1726853302.16901: Sending initial data 13273 1726853302.16904: Sent initial data (152 bytes) 13273 1726853302.17501: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853302.17584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853302.17605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853302.17617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853302.17626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853302.17721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853302.19754: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853302.19812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853302.19882: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpj2h5pbm2 /root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/AnsiballZ_stat.py <<< 13273 1726853302.19892: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/AnsiballZ_stat.py" <<< 13273 1726853302.19940: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpj2h5pbm2" to remote "/root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/AnsiballZ_stat.py" <<< 13273 1726853302.20818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853302.20822: stdout chunk (state=3): >>><<< 13273 1726853302.20828: stderr chunk (state=3): >>><<< 13273 1726853302.20929: done transferring module to remote 13273 1726853302.20933: _low_level_execute_command(): starting 13273 1726853302.20935: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/ /root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/AnsiballZ_stat.py && sleep 0' 13273 1726853302.21579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853302.21695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853302.21719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853302.21737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853302.21830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853302.23721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853302.23731: stdout chunk (state=3): >>><<< 13273 1726853302.23748: stderr chunk (state=3): >>><<< 13273 1726853302.23767: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853302.23778: _low_level_execute_command(): starting
13273 1726853302.23795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/AnsiballZ_stat.py && sleep 0'
13273 1726853302.24391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853302.24403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853302.24421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13273 1726853302.24436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<<
13273 1726853302.24448: stderr chunk (state=3): >>>debug2: match not found <<<
13273 1726853302.24502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853302.24559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853302.24596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853302.24620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853302.24720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853302.40441: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<<
13273 1726853302.42104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<<
13273 1726853302.42107: stdout chunk (state=3): >>><<<
13273 1726853302.42110: stderr chunk (state=3): >>><<<
13273 1726853302.42410: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed.
13273 1726853302.42414: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
13273 1726853302.42417: _low_level_execute_command(): starting
13273 1726853302.42419: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853302.131627-14273-175335695615081/ > /dev/null 2>&1 && sleep 0'
13273 1726853302.43478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853302.43492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853302.43674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853302.43682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853302.43689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853302.43804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853302.45654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853302.45665: stderr chunk (state=3): >>><<<
13273 1726853302.45862: stdout chunk (state=3): >>><<<
13273 1726853302.45866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853302.45868: handler run complete
13273 1726853302.45872: attempt loop complete, returning result
13273 1726853302.45875: _execute() done
13273 1726853302.45876: dumping result to json
13273 1726853302.45878: done dumping result, returning
13273 1726853302.45880: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-5fc3-657d-00000000062d]
13273 1726853302.45882: sending task result for task 02083763-bbaf-5fc3-657d-00000000062d
ok: [managed_node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
13273 1726853302.46026: no more pending results, returning what we have
13273 1726853302.46030: results queue empty
13273 1726853302.46031: checking for any_errors_fatal
13273 1726853302.46038: done checking for any_errors_fatal
13273 1726853302.46038: checking for max_fail_percentage
13273 1726853302.46041: done checking for max_fail_percentage
13273 1726853302.46041: checking to see if all hosts have failed and the running result is not ok
13273 1726853302.46045: done checking to see if all hosts have failed
13273 1726853302.46045: getting the remaining hosts for this loop
13273 1726853302.46047: done getting the remaining hosts for this loop
13273 1726853302.46051: getting the next task for host managed_node3
13273 1726853302.46057: done getting next task for host managed_node3
13273 1726853302.46060: ^ task is: TASK: Set NM profile exist flag based on the profile files
13273 1726853302.46064: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853302.46067: getting variables
13273 1726853302.46069: in VariableManager get_vars()
13273 1726853302.46194: Calling all_inventory to load vars for managed_node3
13273 1726853302.46197: Calling groups_inventory to load vars for managed_node3
13273 1726853302.46199: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853302.46205: done sending task result for task 02083763-bbaf-5fc3-657d-00000000062d
13273 1726853302.46207: WORKER PROCESS EXITING
13273 1726853302.46216: Calling all_plugins_play to load vars for managed_node3
13273 1726853302.46220: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853302.46222: Calling groups_plugins_play to load vars for managed_node3
13273 1726853302.49600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853302.53245: done with get_vars()
13273 1726853302.53267: done getting variables
13273 1726853302.53374: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag based on the profile files] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17
Friday 20 September 2024 13:28:22 -0400 (0:00:00.455) 0:00:20.423 ******
13273 1726853302.53408: entering _queue_task() for managed_node3/set_fact
13273 1726853302.54187: worker is 1 (out of 1 available)
13273 1726853302.54204: exiting _queue_task() for managed_node3/set_fact
13273 1726853302.54219: done queuing things up, now waiting for results queue to drain
13273 1726853302.54220: waiting for pending results...
13273 1726853302.54766: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files
13273 1726853302.54892: in run() - task 02083763-bbaf-5fc3-657d-00000000062e
13273 1726853302.54905: variable 'ansible_search_path' from source: unknown
13273 1726853302.54908: variable 'ansible_search_path' from source: unknown
13273 1726853302.54948: calling self._execute()
13273 1726853302.55040: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853302.55048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853302.55061: variable 'omit' from source: magic vars
13273 1726853302.55901: variable 'ansible_distribution_major_version' from source: facts
13273 1726853302.55913: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853302.56330: variable 'profile_stat' from source: set_fact
13273 1726853302.56341: Evaluated conditional (profile_stat.stat.exists): False
13273 1726853302.56347: when evaluation is False, skipping this task
13273 1726853302.56350: _execute() done
13273 1726853302.56352: dumping result to json
13273 1726853302.56354: done dumping result, returning
13273 1726853302.56357: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-5fc3-657d-00000000062e]
13273 1726853302.56414: sending task result for task 02083763-bbaf-5fc3-657d-00000000062e
13273 1726853302.56478: done sending task result for task 02083763-bbaf-5fc3-657d-00000000062e
13273 1726853302.56482: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
13273 1726853302.56545: no more pending results, returning what we have
13273 1726853302.56549: results queue empty
13273 1726853302.56551: checking for any_errors_fatal
13273 1726853302.56559: done checking for any_errors_fatal
13273 1726853302.56560: checking for max_fail_percentage
13273 1726853302.56562: done checking for max_fail_percentage
13273 1726853302.56563: checking to see if all hosts have failed and the running result is not ok
13273 1726853302.56563: done checking to see if all hosts have failed
13273 1726853302.56564: getting the remaining hosts for this loop
13273 1726853302.56566: done getting the remaining hosts for this loop
13273 1726853302.56569: getting the next task for host managed_node3
13273 1726853302.56578: done getting next task for host managed_node3
13273 1726853302.56581: ^ task is: TASK: Get NM profile info
13273 1726853302.56585: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853302.56591: getting variables
13273 1726853302.56593: in VariableManager get_vars()
13273 1726853302.56654: Calling all_inventory to load vars for managed_node3
13273 1726853302.56657: Calling groups_inventory to load vars for managed_node3
13273 1726853302.56659: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853302.56846: Calling all_plugins_play to load vars for managed_node3
13273 1726853302.56850: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853302.56854: Calling groups_plugins_play to load vars for managed_node3
13273 1726853302.59868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853302.63267: done with get_vars()
13273 1726853302.63299: done getting variables
13273 1726853302.63360: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Friday 20 September 2024 13:28:22 -0400 (0:00:00.099) 0:00:20.523 ******
13273 1726853302.63407: entering _queue_task() for managed_node3/shell
13273 1726853302.63756: worker is 1 (out of 1 available)
13273 1726853302.63768: exiting _queue_task() for managed_node3/shell
13273 1726853302.63783: done queuing things up, now waiting for results queue to drain
13273 1726853302.63784: waiting for pending results...
13273 1726853302.64137: running TaskExecutor() for managed_node3/TASK: Get NM profile info
13273 1726853302.64289: in run() - task 02083763-bbaf-5fc3-657d-00000000062f
13273 1726853302.64295: variable 'ansible_search_path' from source: unknown
13273 1726853302.64297: variable 'ansible_search_path' from source: unknown
13273 1726853302.64301: calling self._execute()
13273 1726853302.64392: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853302.64410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853302.64425: variable 'omit' from source: magic vars
13273 1726853302.64854: variable 'ansible_distribution_major_version' from source: facts
13273 1726853302.64947: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853302.64951: variable 'omit' from source: magic vars
13273 1726853302.64953: variable 'omit' from source: magic vars
13273 1726853302.65053: variable 'profile' from source: include params
13273 1726853302.65064: variable 'item' from source: include params
13273 1726853302.65137: variable 'item' from source: include params
13273 1726853302.65166: variable 'omit' from source: magic vars
13273 1726853302.65213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13273 1726853302.65255: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13273 1726853302.65288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13273 1726853302.65313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853302.65330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853302.65366: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13273 1726853302.65381: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853302.65478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853302.65496: Set connection var ansible_connection to ssh
13273 1726853302.65512: Set connection var ansible_module_compression to ZIP_DEFLATED
13273 1726853302.65812: Set connection var ansible_shell_executable to /bin/sh
13273 1726853302.65816: Set connection var ansible_shell_type to sh
13273 1726853302.65821: Set connection var ansible_pipelining to False
13273 1726853302.65824: Set connection var ansible_timeout to 10
13273 1726853302.65827: variable 'ansible_shell_executable' from source: unknown
13273 1726853302.65830: variable 'ansible_connection' from source: unknown
13273 1726853302.65833: variable 'ansible_module_compression' from source: unknown
13273 1726853302.65835: variable 'ansible_shell_type' from source: unknown
13273 1726853302.65838: variable 'ansible_shell_executable' from source: unknown
13273 1726853302.65841: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853302.65843: variable 'ansible_pipelining' from source: unknown
13273 1726853302.65846: variable 'ansible_timeout' from source: unknown
13273 1726853302.65848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853302.66040: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13273 1726853302.66058: variable 'omit' from source: magic vars
13273 1726853302.66133: starting attempt loop
13273 1726853302.66141: running the handler
13273 1726853302.66156: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13273 1726853302.66181: _low_level_execute_command(): starting
13273 1726853302.66191: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13273 1726853302.67257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13273 1726853302.67275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853302.67291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853302.67314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13273 1726853302.67333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<<
13273 1726853302.67432: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853302.67446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853302.67617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853302.69318: stdout chunk (state=3): >>>/root <<<
13273 1726853302.69710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853302.69714: stdout chunk (state=3): >>><<<
13273 1726853302.69716: stderr chunk (state=3): >>><<<
13273 1726853302.69823: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853302.69827: _low_level_execute_command(): starting
13273 1726853302.69831: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795 `" && echo ansible-tmp-1726853302.6973991-14290-206072844926795="` echo /root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795 `" ) && sleep 0'
13273 1726853302.71326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13273 1726853302.71576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853302.71709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853302.71805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853302.73818: stdout chunk (state=3): >>>ansible-tmp-1726853302.6973991-14290-206072844926795=/root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795 <<<
13273 1726853302.73977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853302.73980: stdout chunk (state=3): >>><<<
13273 1726853302.73983: stderr chunk (state=3): >>><<<
13273 1726853302.73985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853302.6973991-14290-206072844926795=/root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853302.74014: variable 'ansible_module_compression' from source: unknown
13273 1726853302.74099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
13273 1726853302.74109: variable 'ansible_facts' from source: unknown
13273 1726853302.74212: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/AnsiballZ_command.py
13273 1726853302.74548: Sending initial data
13273 1726853302.74554: Sent initial data (156 bytes)
13273 1726853302.75541: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853302.75788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853302.75994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853302.77611: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
13273 1726853302.77666: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
13273 1726853302.77727: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp0knjn8q7 /root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/AnsiballZ_command.py <<<
13273 1726853302.77730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/AnsiballZ_command.py" <<<
13273 1726853302.77822: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp0knjn8q7" to remote "/root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/AnsiballZ_command.py" <<<
13273 1726853302.78840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853302.78847: stdout chunk (state=3): >>><<<
13273 1726853302.78850: stderr chunk (state=3): >>><<<
13273 1726853302.78852: done transferring module to remote
13273 1726853302.78854: _low_level_execute_command(): starting
13273 1726853302.78857: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/ /root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/AnsiballZ_command.py && sleep 0'
13273 1726853302.79404: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13273 1726853302.79417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853302.79430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853302.79447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13273 1726853302.79462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<<
13273 1726853302.79477: stderr chunk (state=3): >>>debug2: match not found <<<
13273 1726853302.79521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853302.79626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853302.79656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853302.79667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853302.79761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853302.81652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853302.81655: stdout chunk (state=3): >>><<<
13273 1726853302.81658: stderr chunk (state=3): >>><<<
13273 1726853302.81759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853302.81762: _low_level_execute_command(): starting
13273 1726853302.81767: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/AnsiballZ_command.py && sleep 0'
13273 1726853302.82388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13273 1726853302.82403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853302.82447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853302.82470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13273 1726853302.82496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<<
13273 1726853302.82549: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853302.82630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853302.82733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853302.82945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853303.00600: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:28:22.983383", "end": "2024-09-20 13:28:23.004735", "delta": "0:00:00.021352", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
13273 1726853303.02422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<<
13273 1726853303.02435: stdout chunk (state=3): >>><<<
13273 1726853303.02461: stderr chunk (state=3): >>><<<
13273 1726853303.02488: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:28:22.983383", "end": "2024-09-20 13:28:23.004735", "delta": "0:00:00.021352", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
Shared connection to 10.31.11.217 closed. 13273 1726853303.02622: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853303.02626: _low_level_execute_command(): starting 13273 1726853303.02628: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853302.6973991-14290-206072844926795/ > /dev/null 2>&1 && sleep 0' 13273 1726853303.03252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853303.03285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853303.03393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853303.03414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853303.03438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853303.03547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853303.05482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853303.05485: stdout chunk (state=3): >>><<< 13273 1726853303.05488: stderr chunk (state=3): >>><<< 13273 1726853303.05576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853303.05580: handler run complete 13273 1726853303.05582: Evaluated conditional (False): False 13273 1726853303.05584: attempt loop complete, returning result 13273 1726853303.05586: _execute() done 13273 1726853303.05588: dumping result to json 13273 1726853303.05590: done dumping result, returning 13273 1726853303.05592: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-5fc3-657d-00000000062f] 13273 1726853303.05599: sending task result for task 02083763-bbaf-5fc3-657d-00000000062f ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.021352", "end": "2024-09-20 13:28:23.004735", "rc": 0, "start": "2024-09-20 13:28:22.983383" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 13273 1726853303.05855: no more pending results, returning what we have 13273 1726853303.05858: results queue empty 13273 1726853303.05859: checking for any_errors_fatal 13273 1726853303.05864: done checking for any_errors_fatal 13273 1726853303.05865: checking for max_fail_percentage 13273 1726853303.05867: done checking for max_fail_percentage 13273 1726853303.05868: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.05868: done checking to see if all hosts have failed 13273 1726853303.05869: getting the remaining hosts for this loop 13273 1726853303.05873: done getting the remaining hosts for this loop 13273 1726853303.05876: getting the next task for host managed_node3 13273 1726853303.05979: done getting next task for host managed_node3 13273 1726853303.05982: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13273 1726853303.05986: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853303.05998: getting variables 13273 1726853303.06000: in VariableManager get_vars() 13273 1726853303.06059: Calling all_inventory to load vars for managed_node3 13273 1726853303.06062: Calling groups_inventory to load vars for managed_node3 13273 1726853303.06065: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.06187: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.06191: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.06195: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.06796: done sending task result for task 02083763-bbaf-5fc3-657d-00000000062f 13273 1726853303.06799: WORKER PROCESS EXITING 13273 1726853303.07940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.09594: done with get_vars() 13273 1726853303.09617: done getting variables 13273 1726853303.09691: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:28:23 -0400 (0:00:00.463) 0:00:20.986 ****** 13273 1726853303.09722: entering _queue_task() for managed_node3/set_fact 13273 1726853303.10091: worker is 1 (out of 1 available) 13273 1726853303.10109: exiting _queue_task() for managed_node3/set_fact 13273 1726853303.10122: done queuing things up, now waiting for results queue to drain 13273 1726853303.10123: waiting for pending results... 13273 1726853303.10359: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13273 1726853303.10486: in run() - task 02083763-bbaf-5fc3-657d-000000000630 13273 1726853303.10511: variable 'ansible_search_path' from source: unknown 13273 1726853303.10519: variable 'ansible_search_path' from source: unknown 13273 1726853303.10561: calling self._execute() 13273 1726853303.10662: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.10677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.10694: variable 'omit' from source: magic vars 13273 1726853303.11055: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.11075: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.11205: variable 'nm_profile_exists' from source: set_fact 13273 1726853303.11226: Evaluated conditional (nm_profile_exists.rc == 0): True 13273 1726853303.11238: variable 'omit' from source: magic vars 13273 1726853303.11289: variable 'omit' from source: magic vars 13273 1726853303.11325: 
variable 'omit' from source: magic vars 13273 1726853303.11370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853303.11578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853303.11581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853303.11584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.11586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.11589: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853303.11591: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.11593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.11604: Set connection var ansible_connection to ssh 13273 1726853303.11617: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853303.11625: Set connection var ansible_shell_executable to /bin/sh 13273 1726853303.11631: Set connection var ansible_shell_type to sh 13273 1726853303.11639: Set connection var ansible_pipelining to False 13273 1726853303.11648: Set connection var ansible_timeout to 10 13273 1726853303.11678: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.11687: variable 'ansible_connection' from source: unknown 13273 1726853303.11693: variable 'ansible_module_compression' from source: unknown 13273 1726853303.11699: variable 'ansible_shell_type' from source: unknown 13273 1726853303.11705: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.11710: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.11716: variable 'ansible_pipelining' from 
source: unknown 13273 1726853303.11723: variable 'ansible_timeout' from source: unknown 13273 1726853303.11730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.11861: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853303.11879: variable 'omit' from source: magic vars 13273 1726853303.11890: starting attempt loop 13273 1726853303.11897: running the handler 13273 1726853303.11912: handler run complete 13273 1726853303.11925: attempt loop complete, returning result 13273 1726853303.11931: _execute() done 13273 1726853303.11937: dumping result to json 13273 1726853303.11943: done dumping result, returning 13273 1726853303.11954: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-5fc3-657d-000000000630] 13273 1726853303.11963: sending task result for task 02083763-bbaf-5fc3-657d-000000000630 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13273 1726853303.12103: no more pending results, returning what we have 13273 1726853303.12106: results queue empty 13273 1726853303.12107: checking for any_errors_fatal 13273 1726853303.12113: done checking for any_errors_fatal 13273 1726853303.12114: checking for max_fail_percentage 13273 1726853303.12115: done checking for max_fail_percentage 13273 1726853303.12116: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.12116: done checking to see if all hosts have failed 13273 1726853303.12117: getting the remaining hosts for this loop 13273 1726853303.12118: done 
getting the remaining hosts for this loop 13273 1726853303.12121: getting the next task for host managed_node3 13273 1726853303.12129: done getting next task for host managed_node3 13273 1726853303.12131: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13273 1726853303.12135: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853303.12139: getting variables 13273 1726853303.12141: in VariableManager get_vars() 13273 1726853303.12197: Calling all_inventory to load vars for managed_node3 13273 1726853303.12199: Calling groups_inventory to load vars for managed_node3 13273 1726853303.12201: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.12213: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.12215: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.12219: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.12740: done sending task result for task 02083763-bbaf-5fc3-657d-000000000630 13273 1726853303.12744: WORKER PROCESS EXITING 13273 1726853303.13555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.15079: done with get_vars() 13273 1726853303.15102: done getting variables 13273 1726853303.15164: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853303.15288: variable 'profile' from source: include params 13273 1726853303.15292: variable 'item' from source: include params 13273 1726853303.15350: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:28:23 -0400 (0:00:00.056) 0:00:21.043 ****** 13273 1726853303.15387: entering _queue_task() for managed_node3/command 13273 1726853303.15803: worker is 1 (out of 1 available) 13273 1726853303.15815: exiting _queue_task() for managed_node3/command 13273 
1726853303.15827: done queuing things up, now waiting for results queue to drain 13273 1726853303.15828: waiting for pending results... 13273 1726853303.16015: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 13273 1726853303.16136: in run() - task 02083763-bbaf-5fc3-657d-000000000632 13273 1726853303.16157: variable 'ansible_search_path' from source: unknown 13273 1726853303.16167: variable 'ansible_search_path' from source: unknown 13273 1726853303.16207: calling self._execute() 13273 1726853303.16310: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.16323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.16335: variable 'omit' from source: magic vars 13273 1726853303.16692: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.16712: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.16835: variable 'profile_stat' from source: set_fact 13273 1726853303.16853: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853303.16918: when evaluation is False, skipping this task 13273 1726853303.16922: _execute() done 13273 1726853303.16925: dumping result to json 13273 1726853303.16927: done dumping result, returning 13273 1726853303.16930: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [02083763-bbaf-5fc3-657d-000000000632] 13273 1726853303.16932: sending task result for task 02083763-bbaf-5fc3-657d-000000000632 13273 1726853303.16997: done sending task result for task 02083763-bbaf-5fc3-657d-000000000632 13273 1726853303.17000: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853303.17074: no more pending results, returning what we have 13273 1726853303.17078: results queue empty 13273 
1726853303.17080: checking for any_errors_fatal 13273 1726853303.17086: done checking for any_errors_fatal 13273 1726853303.17086: checking for max_fail_percentage 13273 1726853303.17088: done checking for max_fail_percentage 13273 1726853303.17089: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.17090: done checking to see if all hosts have failed 13273 1726853303.17090: getting the remaining hosts for this loop 13273 1726853303.17092: done getting the remaining hosts for this loop 13273 1726853303.17095: getting the next task for host managed_node3 13273 1726853303.17102: done getting next task for host managed_node3 13273 1726853303.17104: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13273 1726853303.17108: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853303.17113: getting variables 13273 1726853303.17115: in VariableManager get_vars() 13273 1726853303.17420: Calling all_inventory to load vars for managed_node3 13273 1726853303.17423: Calling groups_inventory to load vars for managed_node3 13273 1726853303.17426: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.17435: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.17438: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.17441: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.18985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.22092: done with get_vars() 13273 1726853303.22116: done getting variables 13273 1726853303.22174: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853303.22482: variable 'profile' from source: include params 13273 1726853303.22486: variable 'item' from source: include params 13273 1726853303.22542: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:28:23 -0400 (0:00:00.071) 0:00:21.114 ****** 13273 1726853303.22574: entering _queue_task() for managed_node3/set_fact 13273 1726853303.23311: worker is 1 (out of 1 available) 13273 1726853303.23322: exiting _queue_task() for managed_node3/set_fact 13273 1726853303.23333: done queuing things up, now waiting for results queue to drain 13273 1726853303.23334: waiting for pending results... 
13273 1726853303.23790: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 13273 1726853303.23794: in run() - task 02083763-bbaf-5fc3-657d-000000000633 13273 1726853303.23986: variable 'ansible_search_path' from source: unknown 13273 1726853303.23991: variable 'ansible_search_path' from source: unknown 13273 1726853303.24024: calling self._execute() 13273 1726853303.24121: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.24177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.24181: variable 'omit' from source: magic vars 13273 1726853303.24924: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.24935: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.25176: variable 'profile_stat' from source: set_fact 13273 1726853303.25180: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853303.25182: when evaluation is False, skipping this task 13273 1726853303.25184: _execute() done 13273 1726853303.25185: dumping result to json 13273 1726853303.25191: done dumping result, returning 13273 1726853303.25196: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [02083763-bbaf-5fc3-657d-000000000633] 13273 1726853303.25198: sending task result for task 02083763-bbaf-5fc3-657d-000000000633 13273 1726853303.25257: done sending task result for task 02083763-bbaf-5fc3-657d-000000000633 13273 1726853303.25260: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853303.25343: no more pending results, returning what we have 13273 1726853303.25347: results queue empty 13273 1726853303.25348: checking for any_errors_fatal 13273 1726853303.25353: done checking for any_errors_fatal 13273 1726853303.25354: 
checking for max_fail_percentage 13273 1726853303.25355: done checking for max_fail_percentage 13273 1726853303.25356: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.25356: done checking to see if all hosts have failed 13273 1726853303.25357: getting the remaining hosts for this loop 13273 1726853303.25358: done getting the remaining hosts for this loop 13273 1726853303.25362: getting the next task for host managed_node3 13273 1726853303.25367: done getting next task for host managed_node3 13273 1726853303.25369: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13273 1726853303.25374: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853303.25378: getting variables 13273 1726853303.25379: in VariableManager get_vars() 13273 1726853303.25422: Calling all_inventory to load vars for managed_node3 13273 1726853303.25424: Calling groups_inventory to load vars for managed_node3 13273 1726853303.25426: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.25436: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.25438: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.25441: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.28172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.31534: done with get_vars() 13273 1726853303.31565: done getting variables 13273 1726853303.31730: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853303.31955: variable 'profile' from source: include params 13273 1726853303.31959: variable 'item' from source: include params 13273 1726853303.32200: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:28:23 -0400 (0:00:00.096) 0:00:21.211 ****** 13273 1726853303.32280: entering _queue_task() for managed_node3/command 13273 1726853303.32963: worker is 1 (out of 1 available) 13273 1726853303.32977: exiting _queue_task() for managed_node3/command 13273 1726853303.33102: done queuing things up, now waiting for results queue to drain 13273 1726853303.33103: waiting for pending results... 
13273 1726853303.33564: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 13273 1726853303.33570: in run() - task 02083763-bbaf-5fc3-657d-000000000634 13273 1726853303.33689: variable 'ansible_search_path' from source: unknown 13273 1726853303.33693: variable 'ansible_search_path' from source: unknown 13273 1726853303.33726: calling self._execute() 13273 1726853303.33824: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.33831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.33840: variable 'omit' from source: magic vars 13273 1726853303.34522: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.34526: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.34683: variable 'profile_stat' from source: set_fact 13273 1726853303.34687: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853303.34689: when evaluation is False, skipping this task 13273 1726853303.34691: _execute() done 13273 1726853303.34693: dumping result to json 13273 1726853303.34695: done dumping result, returning 13273 1726853303.34697: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [02083763-bbaf-5fc3-657d-000000000634] 13273 1726853303.34698: sending task result for task 02083763-bbaf-5fc3-657d-000000000634 13273 1726853303.34757: done sending task result for task 02083763-bbaf-5fc3-657d-000000000634 13273 1726853303.34760: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853303.34834: no more pending results, returning what we have 13273 1726853303.34839: results queue empty 13273 1726853303.34840: checking for any_errors_fatal 13273 1726853303.34846: done checking for any_errors_fatal 13273 1726853303.34847: checking for 
max_fail_percentage 13273 1726853303.34849: done checking for max_fail_percentage 13273 1726853303.34850: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.34850: done checking to see if all hosts have failed 13273 1726853303.34851: getting the remaining hosts for this loop 13273 1726853303.34852: done getting the remaining hosts for this loop 13273 1726853303.34855: getting the next task for host managed_node3 13273 1726853303.34861: done getting next task for host managed_node3 13273 1726853303.34864: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13273 1726853303.34867: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853303.34976: getting variables 13273 1726853303.34978: in VariableManager get_vars() 13273 1726853303.35024: Calling all_inventory to load vars for managed_node3 13273 1726853303.35027: Calling groups_inventory to load vars for managed_node3 13273 1726853303.35029: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.35038: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.35041: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.35044: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.38683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.41668: done with get_vars() 13273 1726853303.41694: done getting variables 13273 1726853303.41750: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853303.42060: variable 'profile' from source: include params 13273 1726853303.42064: variable 'item' from source: include params 13273 1726853303.42121: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:28:23 -0400 (0:00:00.099) 0:00:21.310 ****** 13273 1726853303.42152: entering _queue_task() for managed_node3/set_fact 13273 1726853303.42869: worker is 1 (out of 1 available) 13273 1726853303.42883: exiting _queue_task() for managed_node3/set_fact 13273 1726853303.42895: done queuing things up, now waiting for results queue to drain 13273 1726853303.42896: waiting for pending results... 
13273 1726853303.43490: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 13273 1726853303.43494: in run() - task 02083763-bbaf-5fc3-657d-000000000635 13273 1726853303.43498: variable 'ansible_search_path' from source: unknown 13273 1726853303.43500: variable 'ansible_search_path' from source: unknown 13273 1726853303.43693: calling self._execute() 13273 1726853303.43787: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.43794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.43806: variable 'omit' from source: magic vars 13273 1726853303.44586: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.44597: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.44923: variable 'profile_stat' from source: set_fact 13273 1726853303.44935: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853303.44938: when evaluation is False, skipping this task 13273 1726853303.44940: _execute() done 13273 1726853303.44943: dumping result to json 13273 1726853303.44976: done dumping result, returning 13273 1726853303.44980: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [02083763-bbaf-5fc3-657d-000000000635] 13273 1726853303.44982: sending task result for task 02083763-bbaf-5fc3-657d-000000000635 13273 1726853303.45222: done sending task result for task 02083763-bbaf-5fc3-657d-000000000635 13273 1726853303.45226: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853303.45275: no more pending results, returning what we have 13273 1726853303.45279: results queue empty 13273 1726853303.45280: checking for any_errors_fatal 13273 1726853303.45287: done checking for any_errors_fatal 13273 1726853303.45288: checking 
for max_fail_percentage 13273 1726853303.45290: done checking for max_fail_percentage 13273 1726853303.45291: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.45292: done checking to see if all hosts have failed 13273 1726853303.45292: getting the remaining hosts for this loop 13273 1726853303.45294: done getting the remaining hosts for this loop 13273 1726853303.45297: getting the next task for host managed_node3 13273 1726853303.45304: done getting next task for host managed_node3 13273 1726853303.45307: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13273 1726853303.45310: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853303.45315: getting variables 13273 1726853303.45316: in VariableManager get_vars() 13273 1726853303.45373: Calling all_inventory to load vars for managed_node3 13273 1726853303.45377: Calling groups_inventory to load vars for managed_node3 13273 1726853303.45379: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.45389: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.45392: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.45394: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.48579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.51989: done with get_vars() 13273 1726853303.52013: done getting variables 13273 1726853303.52078: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853303.52496: variable 'profile' from source: include params 13273 1726853303.52500: variable 'item' from source: include params 13273 1726853303.52557: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:28:23 -0400 (0:00:00.104) 0:00:21.415 ****** 13273 1726853303.52590: entering _queue_task() for managed_node3/assert 13273 1726853303.53336: worker is 1 (out of 1 available) 13273 1726853303.53349: exiting _queue_task() for managed_node3/assert 13273 1726853303.53364: done queuing things up, now waiting for results queue to drain 13273 1726853303.53365: waiting for pending results... 
13273 1726853303.53931: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 13273 1726853303.54163: in run() - task 02083763-bbaf-5fc3-657d-00000000035d 13273 1726853303.54244: variable 'ansible_search_path' from source: unknown 13273 1726853303.54248: variable 'ansible_search_path' from source: unknown 13273 1726853303.54251: calling self._execute() 13273 1726853303.54432: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.54439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.54452: variable 'omit' from source: magic vars 13273 1726853303.55355: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.55627: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.55631: variable 'omit' from source: magic vars 13273 1726853303.55634: variable 'omit' from source: magic vars 13273 1726853303.55673: variable 'profile' from source: include params 13273 1726853303.55677: variable 'item' from source: include params 13273 1726853303.55736: variable 'item' from source: include params 13273 1726853303.55869: variable 'omit' from source: magic vars 13273 1726853303.55914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853303.55953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853303.55972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853303.56111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.56121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.56154: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13273 1726853303.56158: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.56161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.56400: Set connection var ansible_connection to ssh 13273 1726853303.56411: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853303.56490: Set connection var ansible_shell_executable to /bin/sh 13273 1726853303.56493: Set connection var ansible_shell_type to sh 13273 1726853303.56498: Set connection var ansible_pipelining to False 13273 1726853303.56505: Set connection var ansible_timeout to 10 13273 1726853303.56644: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.56650: variable 'ansible_connection' from source: unknown 13273 1726853303.56653: variable 'ansible_module_compression' from source: unknown 13273 1726853303.56656: variable 'ansible_shell_type' from source: unknown 13273 1726853303.56658: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.56660: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.56675: variable 'ansible_pipelining' from source: unknown 13273 1726853303.56678: variable 'ansible_timeout' from source: unknown 13273 1726853303.56680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.57056: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853303.57060: variable 'omit' from source: magic vars 13273 1726853303.57062: starting attempt loop 13273 1726853303.57065: running the handler 13273 1726853303.57210: variable 'lsr_net_profile_exists' from source: set_fact 13273 1726853303.57213: Evaluated conditional 
(lsr_net_profile_exists): True 13273 1726853303.57221: handler run complete 13273 1726853303.57235: attempt loop complete, returning result 13273 1726853303.57238: _execute() done 13273 1726853303.57240: dumping result to json 13273 1726853303.57243: done dumping result, returning 13273 1726853303.57253: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [02083763-bbaf-5fc3-657d-00000000035d] 13273 1726853303.57258: sending task result for task 02083763-bbaf-5fc3-657d-00000000035d 13273 1726853303.57458: done sending task result for task 02083763-bbaf-5fc3-657d-00000000035d 13273 1726853303.57461: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853303.57530: no more pending results, returning what we have 13273 1726853303.57534: results queue empty 13273 1726853303.57535: checking for any_errors_fatal 13273 1726853303.57541: done checking for any_errors_fatal 13273 1726853303.57541: checking for max_fail_percentage 13273 1726853303.57543: done checking for max_fail_percentage 13273 1726853303.57544: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.57545: done checking to see if all hosts have failed 13273 1726853303.57545: getting the remaining hosts for this loop 13273 1726853303.57547: done getting the remaining hosts for this loop 13273 1726853303.57551: getting the next task for host managed_node3 13273 1726853303.57557: done getting next task for host managed_node3 13273 1726853303.57560: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13273 1726853303.57564: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853303.57568: getting variables 13273 1726853303.57569: in VariableManager get_vars() 13273 1726853303.57722: Calling all_inventory to load vars for managed_node3 13273 1726853303.57725: Calling groups_inventory to load vars for managed_node3 13273 1726853303.57728: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.57738: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.57741: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.57744: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.60418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.62264: done with get_vars() 13273 1726853303.62292: done getting variables 13273 1726853303.62366: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853303.62488: variable 'profile' from source: include params 13273 1726853303.62491: variable 'item' from source: include params 13273 1726853303.62558: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:28:23 -0400 
(0:00:00.100) 0:00:21.515 ****** 13273 1726853303.62601: entering _queue_task() for managed_node3/assert 13273 1726853303.62958: worker is 1 (out of 1 available) 13273 1726853303.63176: exiting _queue_task() for managed_node3/assert 13273 1726853303.63189: done queuing things up, now waiting for results queue to drain 13273 1726853303.63190: waiting for pending results... 13273 1726853303.63392: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 13273 1726853303.63426: in run() - task 02083763-bbaf-5fc3-657d-00000000035e 13273 1726853303.63449: variable 'ansible_search_path' from source: unknown 13273 1726853303.63464: variable 'ansible_search_path' from source: unknown 13273 1726853303.63509: calling self._execute() 13273 1726853303.63615: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.63634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.63744: variable 'omit' from source: magic vars 13273 1726853303.64048: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.64076: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.64090: variable 'omit' from source: magic vars 13273 1726853303.64134: variable 'omit' from source: magic vars 13273 1726853303.64243: variable 'profile' from source: include params 13273 1726853303.64254: variable 'item' from source: include params 13273 1726853303.64324: variable 'item' from source: include params 13273 1726853303.64350: variable 'omit' from source: magic vars 13273 1726853303.64404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853303.64443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853303.64468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 
1726853303.64492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.64516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.64552: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853303.64559: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.64565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.64667: Set connection var ansible_connection to ssh 13273 1726853303.64721: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853303.64724: Set connection var ansible_shell_executable to /bin/sh 13273 1726853303.64726: Set connection var ansible_shell_type to sh 13273 1726853303.64728: Set connection var ansible_pipelining to False 13273 1726853303.64730: Set connection var ansible_timeout to 10 13273 1726853303.64741: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.64748: variable 'ansible_connection' from source: unknown 13273 1726853303.64754: variable 'ansible_module_compression' from source: unknown 13273 1726853303.64760: variable 'ansible_shell_type' from source: unknown 13273 1726853303.64766: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.64773: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.64831: variable 'ansible_pipelining' from source: unknown 13273 1726853303.64835: variable 'ansible_timeout' from source: unknown 13273 1726853303.64837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.64929: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853303.64949: variable 'omit' from source: magic vars 13273 1726853303.64958: starting attempt loop 13273 1726853303.64965: running the handler 13273 1726853303.65081: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13273 1726853303.65092: Evaluated conditional (lsr_net_profile_ansible_managed): True 13273 1726853303.65102: handler run complete 13273 1726853303.65121: attempt loop complete, returning result 13273 1726853303.65157: _execute() done 13273 1726853303.65160: dumping result to json 13273 1726853303.65162: done dumping result, returning 13273 1726853303.65164: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [02083763-bbaf-5fc3-657d-00000000035e] 13273 1726853303.65166: sending task result for task 02083763-bbaf-5fc3-657d-00000000035e ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853303.65308: no more pending results, returning what we have 13273 1726853303.65311: results queue empty 13273 1726853303.65312: checking for any_errors_fatal 13273 1726853303.65320: done checking for any_errors_fatal 13273 1726853303.65321: checking for max_fail_percentage 13273 1726853303.65323: done checking for max_fail_percentage 13273 1726853303.65324: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.65324: done checking to see if all hosts have failed 13273 1726853303.65325: getting the remaining hosts for this loop 13273 1726853303.65327: done getting the remaining hosts for this loop 13273 1726853303.65329: getting the next task for host managed_node3 13273 1726853303.65335: done getting next task for host managed_node3 13273 1726853303.65337: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 
13273 1726853303.65341: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853303.65346: getting variables 13273 1726853303.65347: in VariableManager get_vars() 13273 1726853303.65516: Calling all_inventory to load vars for managed_node3 13273 1726853303.65520: Calling groups_inventory to load vars for managed_node3 13273 1726853303.65523: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.65538: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.65542: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.65545: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.66085: done sending task result for task 02083763-bbaf-5fc3-657d-00000000035e 13273 1726853303.66089: WORKER PROCESS EXITING 13273 1726853303.67457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.69031: done with get_vars() 13273 1726853303.69053: done getting variables 13273 1726853303.69126: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853303.69241: variable 'profile' from source: include params 13273 1726853303.69245: variable 'item' from 
source: include params 13273 1726853303.69317: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:28:23 -0400 (0:00:00.067) 0:00:21.582 ****** 13273 1726853303.69363: entering _queue_task() for managed_node3/assert 13273 1726853303.69806: worker is 1 (out of 1 available) 13273 1726853303.69816: exiting _queue_task() for managed_node3/assert 13273 1726853303.69828: done queuing things up, now waiting for results queue to drain 13273 1726853303.69829: waiting for pending results... 13273 1726853303.70024: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 13273 1726853303.70151: in run() - task 02083763-bbaf-5fc3-657d-00000000035f 13273 1726853303.70179: variable 'ansible_search_path' from source: unknown 13273 1726853303.70187: variable 'ansible_search_path' from source: unknown 13273 1726853303.70230: calling self._execute() 13273 1726853303.70336: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.70349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.70367: variable 'omit' from source: magic vars 13273 1726853303.70759: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.70779: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.70795: variable 'omit' from source: magic vars 13273 1726853303.70920: variable 'omit' from source: magic vars 13273 1726853303.70951: variable 'profile' from source: include params 13273 1726853303.70961: variable 'item' from source: include params 13273 1726853303.71036: variable 'item' from source: include params 13273 1726853303.71060: variable 'omit' from source: magic vars 13273 1726853303.71109: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853303.71160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853303.71187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853303.71209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.71224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.71266: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853303.71355: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.71359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.71391: Set connection var ansible_connection to ssh 13273 1726853303.71407: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853303.71417: Set connection var ansible_shell_executable to /bin/sh 13273 1726853303.71425: Set connection var ansible_shell_type to sh 13273 1726853303.71434: Set connection var ansible_pipelining to False 13273 1726853303.71443: Set connection var ansible_timeout to 10 13273 1726853303.71482: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.71490: variable 'ansible_connection' from source: unknown 13273 1726853303.71574: variable 'ansible_module_compression' from source: unknown 13273 1726853303.71577: variable 'ansible_shell_type' from source: unknown 13273 1726853303.71579: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.71582: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.71584: variable 'ansible_pipelining' from source: unknown 13273 1726853303.71586: variable 'ansible_timeout' 
from source: unknown 13273 1726853303.71588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.71678: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853303.71703: variable 'omit' from source: magic vars 13273 1726853303.71776: starting attempt loop 13273 1726853303.71779: running the handler 13273 1726853303.71835: variable 'lsr_net_profile_fingerprint' from source: set_fact 13273 1726853303.71845: Evaluated conditional (lsr_net_profile_fingerprint): True 13273 1726853303.71856: handler run complete 13273 1726853303.71879: attempt loop complete, returning result 13273 1726853303.71886: _execute() done 13273 1726853303.71898: dumping result to json 13273 1726853303.71905: done dumping result, returning 13273 1726853303.72002: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [02083763-bbaf-5fc3-657d-00000000035f] 13273 1726853303.72005: sending task result for task 02083763-bbaf-5fc3-657d-00000000035f ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853303.72125: no more pending results, returning what we have 13273 1726853303.72129: results queue empty 13273 1726853303.72130: checking for any_errors_fatal 13273 1726853303.72137: done checking for any_errors_fatal 13273 1726853303.72138: checking for max_fail_percentage 13273 1726853303.72140: done checking for max_fail_percentage 13273 1726853303.72141: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.72141: done checking to see if all hosts have failed 13273 1726853303.72142: getting the remaining hosts for this loop 13273 1726853303.72144: done getting the remaining hosts for 
this loop 13273 1726853303.72147: getting the next task for host managed_node3 13273 1726853303.72156: done getting next task for host managed_node3 13273 1726853303.72159: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13273 1726853303.72163: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853303.72167: getting variables 13273 1726853303.72168: in VariableManager get_vars() 13273 1726853303.72226: Calling all_inventory to load vars for managed_node3 13273 1726853303.72229: Calling groups_inventory to load vars for managed_node3 13273 1726853303.72231: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.72242: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.72245: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.72248: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.72784: done sending task result for task 02083763-bbaf-5fc3-657d-00000000035f 13273 1726853303.72787: WORKER PROCESS EXITING 13273 1726853303.75067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.78244: done with get_vars() 13273 1726853303.78283: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:28:23 -0400 (0:00:00.092) 0:00:21.674 ****** 13273 1726853303.78568: entering _queue_task() for managed_node3/include_tasks 13273 1726853303.79446: worker is 1 (out of 1 available) 13273 1726853303.79459: exiting _queue_task() for managed_node3/include_tasks 13273 1726853303.79474: done queuing things up, now waiting for results queue to drain 13273 1726853303.79475: waiting for pending results... 13273 1726853303.80080: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 13273 1726853303.80555: in run() - task 02083763-bbaf-5fc3-657d-000000000363 13273 1726853303.80569: variable 'ansible_search_path' from source: unknown 13273 1726853303.80574: variable 'ansible_search_path' from source: unknown 13273 1726853303.80680: calling self._execute() 13273 1726853303.80942: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.80952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.81080: variable 'omit' from source: magic vars 13273 1726853303.81889: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.81900: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.81912: _execute() done 13273 1726853303.81916: dumping result to json 13273 1726853303.81919: done dumping result, returning 13273 1726853303.81922: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-5fc3-657d-000000000363] 13273 1726853303.81924: sending task result for task 02083763-bbaf-5fc3-657d-000000000363 13273 1726853303.82287: no more pending results, returning what we have 13273 1726853303.82291: in VariableManager get_vars() 13273 1726853303.82357: Calling all_inventory to load vars for managed_node3 13273 1726853303.82361: 
Calling groups_inventory to load vars for managed_node3 13273 1726853303.82363: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.82380: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.82383: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.82387: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.83000: done sending task result for task 02083763-bbaf-5fc3-657d-000000000363 13273 1726853303.83005: WORKER PROCESS EXITING 13273 1726853303.85576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.88807: done with get_vars() 13273 1726853303.88836: variable 'ansible_search_path' from source: unknown 13273 1726853303.88838: variable 'ansible_search_path' from source: unknown 13273 1726853303.88986: we have included files to process 13273 1726853303.88987: generating all_blocks data 13273 1726853303.88990: done generating all_blocks data 13273 1726853303.88995: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853303.89000: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853303.89003: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13273 1726853303.90946: done processing included file 13273 1726853303.90948: iterating over new_blocks loaded from include file 13273 1726853303.90950: in VariableManager get_vars() 13273 1726853303.91098: done with get_vars() 13273 1726853303.91100: filtering new block on tags 13273 1726853303.91123: done filtering new block on tags 13273 1726853303.91127: in VariableManager get_vars() 13273 1726853303.91157: done with get_vars() 13273 1726853303.91159: filtering new block 
on tags 13273 1726853303.91182: done filtering new block on tags 13273 1726853303.91184: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 13273 1726853303.91277: extending task lists for all hosts with included blocks 13273 1726853303.91604: done extending task lists 13273 1726853303.91605: done processing included files 13273 1726853303.91606: results queue empty 13273 1726853303.91607: checking for any_errors_fatal 13273 1726853303.91610: done checking for any_errors_fatal 13273 1726853303.91610: checking for max_fail_percentage 13273 1726853303.91611: done checking for max_fail_percentage 13273 1726853303.91612: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.91613: done checking to see if all hosts have failed 13273 1726853303.91613: getting the remaining hosts for this loop 13273 1726853303.91614: done getting the remaining hosts for this loop 13273 1726853303.91617: getting the next task for host managed_node3 13273 1726853303.91675: done getting next task for host managed_node3 13273 1726853303.91678: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13273 1726853303.91681: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853303.91684: getting variables 13273 1726853303.91685: in VariableManager get_vars() 13273 1726853303.91702: Calling all_inventory to load vars for managed_node3 13273 1726853303.91705: Calling groups_inventory to load vars for managed_node3 13273 1726853303.91707: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.91712: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.91714: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.91717: Calling groups_plugins_play to load vars for managed_node3 13273 1726853303.93893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853303.95686: done with get_vars() 13273 1726853303.95708: done getting variables 13273 1726853303.95754: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:28:23 -0400 (0:00:00.173) 0:00:21.847 ****** 13273 1726853303.95879: entering _queue_task() for managed_node3/set_fact 13273 1726853303.96255: worker is 1 (out of 1 available) 13273 1726853303.96266: exiting _queue_task() for managed_node3/set_fact 13273 1726853303.96400: done queuing things up, now waiting for results queue to drain 13273 1726853303.96401: waiting for pending results... 
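For orientation while reading this trace: the task being queued here (get_profile_stat.yml:3) is a plain set_fact. A rough reconstruction, inferred only from the fact names and values echoed in this log's task result — the actual file may differ — would be:

```yaml
# Hedged reconstruction of get_profile_stat.yml:3; based solely on the
# ansible_facts reported in this log's "ok:" result, not the real file.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

set_fact runs entirely on the controller, which is why the handler completes immediately in the trace with no _low_level_execute_command() calls.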
13273 1726853303.96650: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13273 1726853303.96747: in run() - task 02083763-bbaf-5fc3-657d-000000000674 13273 1726853303.96752: variable 'ansible_search_path' from source: unknown 13273 1726853303.96755: variable 'ansible_search_path' from source: unknown 13273 1726853303.96758: calling self._execute() 13273 1726853303.96959: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.96963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.96966: variable 'omit' from source: magic vars 13273 1726853303.97306: variable 'ansible_distribution_major_version' from source: facts 13273 1726853303.97318: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853303.97324: variable 'omit' from source: magic vars 13273 1726853303.97392: variable 'omit' from source: magic vars 13273 1726853303.97436: variable 'omit' from source: magic vars 13273 1726853303.97492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853303.97540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853303.97560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853303.97580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.97599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853303.97663: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853303.97667: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.97669: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13273 1726853303.97748: Set connection var ansible_connection to ssh 13273 1726853303.97757: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853303.97776: Set connection var ansible_shell_executable to /bin/sh 13273 1726853303.97779: Set connection var ansible_shell_type to sh 13273 1726853303.97782: Set connection var ansible_pipelining to False 13273 1726853303.97783: Set connection var ansible_timeout to 10 13273 1726853303.97833: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.97836: variable 'ansible_connection' from source: unknown 13273 1726853303.97840: variable 'ansible_module_compression' from source: unknown 13273 1726853303.97845: variable 'ansible_shell_type' from source: unknown 13273 1726853303.97847: variable 'ansible_shell_executable' from source: unknown 13273 1726853303.97849: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853303.97851: variable 'ansible_pipelining' from source: unknown 13273 1726853303.97853: variable 'ansible_timeout' from source: unknown 13273 1726853303.97856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853303.97977: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853303.98051: variable 'omit' from source: magic vars 13273 1726853303.98054: starting attempt loop 13273 1726853303.98057: running the handler 13273 1726853303.98059: handler run complete 13273 1726853303.98062: attempt loop complete, returning result 13273 1726853303.98064: _execute() done 13273 1726853303.98066: dumping result to json 13273 1726853303.98068: done dumping result, returning 13273 1726853303.98070: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-5fc3-657d-000000000674] 13273 1726853303.98074: sending task result for task 02083763-bbaf-5fc3-657d-000000000674 13273 1726853303.98130: done sending task result for task 02083763-bbaf-5fc3-657d-000000000674 13273 1726853303.98133: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13273 1726853303.98208: no more pending results, returning what we have 13273 1726853303.98211: results queue empty 13273 1726853303.98213: checking for any_errors_fatal 13273 1726853303.98214: done checking for any_errors_fatal 13273 1726853303.98215: checking for max_fail_percentage 13273 1726853303.98217: done checking for max_fail_percentage 13273 1726853303.98218: checking to see if all hosts have failed and the running result is not ok 13273 1726853303.98218: done checking to see if all hosts have failed 13273 1726853303.98220: getting the remaining hosts for this loop 13273 1726853303.98222: done getting the remaining hosts for this loop 13273 1726853303.98225: getting the next task for host managed_node3 13273 1726853303.98233: done getting next task for host managed_node3 13273 1726853303.98236: ^ task is: TASK: Stat profile file 13273 1726853303.98242: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853303.98247: getting variables 13273 1726853303.98249: in VariableManager get_vars() 13273 1726853303.98306: Calling all_inventory to load vars for managed_node3 13273 1726853303.98309: Calling groups_inventory to load vars for managed_node3 13273 1726853303.98312: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853303.98324: Calling all_plugins_play to load vars for managed_node3 13273 1726853303.98328: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853303.98331: Calling groups_plugins_play to load vars for managed_node3 13273 1726853304.00266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853304.02725: done with get_vars() 13273 1726853304.02755: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:28:24 -0400 (0:00:00.069) 0:00:21.917 ****** 13273 1726853304.02851: entering _queue_task() for managed_node3/stat 13273 1726853304.03203: worker is 1 (out of 1 available) 13273 1726853304.03216: exiting _queue_task() for managed_node3/stat 13273 1726853304.03229: done queuing things up, now waiting for results queue to drain 13273 1726853304.03230: waiting for pending results... 
13273 1726853304.03581: running TaskExecutor() for managed_node3/TASK: Stat profile file 13273 1726853304.03585: in run() - task 02083763-bbaf-5fc3-657d-000000000675 13273 1726853304.03589: variable 'ansible_search_path' from source: unknown 13273 1726853304.03592: variable 'ansible_search_path' from source: unknown 13273 1726853304.03634: calling self._execute() 13273 1726853304.03727: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.03792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.03796: variable 'omit' from source: magic vars 13273 1726853304.04227: variable 'ansible_distribution_major_version' from source: facts 13273 1726853304.04231: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853304.04234: variable 'omit' from source: magic vars 13273 1726853304.04248: variable 'omit' from source: magic vars 13273 1726853304.04447: variable 'profile' from source: include params 13273 1726853304.04450: variable 'item' from source: include params 13273 1726853304.04453: variable 'item' from source: include params 13273 1726853304.04455: variable 'omit' from source: magic vars 13273 1726853304.04480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853304.04524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853304.04549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853304.04560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853304.04572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853304.04661: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 
1726853304.04664: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.04667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.04878: Set connection var ansible_connection to ssh 13273 1726853304.04881: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853304.04884: Set connection var ansible_shell_executable to /bin/sh 13273 1726853304.04887: Set connection var ansible_shell_type to sh 13273 1726853304.04889: Set connection var ansible_pipelining to False 13273 1726853304.04892: Set connection var ansible_timeout to 10 13273 1726853304.04893: variable 'ansible_shell_executable' from source: unknown 13273 1726853304.04896: variable 'ansible_connection' from source: unknown 13273 1726853304.04898: variable 'ansible_module_compression' from source: unknown 13273 1726853304.04900: variable 'ansible_shell_type' from source: unknown 13273 1726853304.04902: variable 'ansible_shell_executable' from source: unknown 13273 1726853304.04904: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.04906: variable 'ansible_pipelining' from source: unknown 13273 1726853304.04908: variable 'ansible_timeout' from source: unknown 13273 1726853304.04910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.05001: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853304.05010: variable 'omit' from source: magic vars 13273 1726853304.05016: starting attempt loop 13273 1726853304.05020: running the handler 13273 1726853304.05041: _low_level_execute_command(): starting 13273 1726853304.05047: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853304.05774: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853304.05794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.05812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.05889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.05920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.05929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.05948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.06041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.07757: stdout chunk (state=3): >>>/root <<< 13273 1726853304.07885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.07904: stdout chunk (state=3): >>><<< 13273 1726853304.07919: stderr chunk (state=3): >>><<< 13273 1726853304.07947: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853304.07968: _low_level_execute_command(): starting 13273 1726853304.07982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917 `" && echo ansible-tmp-1726853304.079539-14349-58480531893917="` echo /root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917 `" ) && sleep 0' 13273 1726853304.08624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853304.08640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.08659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.08685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853304.08710: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853304.08738: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.08791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.08853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.08884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.08913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.09104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.10993: stdout chunk (state=3): >>>ansible-tmp-1726853304.079539-14349-58480531893917=/root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917 <<< 13273 1726853304.11155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.11158: stdout chunk (state=3): >>><<< 13273 1726853304.11161: stderr chunk (state=3): >>><<< 13273 1726853304.11376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853304.079539-14349-58480531893917=/root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853304.11379: variable 'ansible_module_compression' from source: unknown 13273 1726853304.11382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13273 1726853304.11385: variable 'ansible_facts' from source: unknown 13273 1726853304.11435: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/AnsiballZ_stat.py 13273 1726853304.11627: Sending initial data 13273 1726853304.11636: Sent initial data (151 bytes) 13273 1726853304.12269: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853304.12287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.12304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.12323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 
1726853304.12389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.12459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.12493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.12511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.12612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.14247: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 
1726853304.14319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853304.14381: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp2ohdsgth /root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/AnsiballZ_stat.py <<< 13273 1726853304.14385: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/AnsiballZ_stat.py" <<< 13273 1726853304.14433: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp2ohdsgth" to remote "/root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/AnsiballZ_stat.py" <<< 13273 1726853304.14440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/AnsiballZ_stat.py" <<< 13273 1726853304.15276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.15279: stdout chunk (state=3): >>><<< 13273 1726853304.15281: stderr chunk (state=3): >>><<< 13273 1726853304.15285: done transferring module to remote 13273 1726853304.15296: _low_level_execute_command(): starting 13273 1726853304.15309: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/ /root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/AnsiballZ_stat.py && sleep 0' 13273 1726853304.15926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853304.15941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.15958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.15987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853304.16000: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.16079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.16094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.16168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.18040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.18068: stderr chunk (state=3): >>><<< 13273 1726853304.18073: stdout chunk (state=3): >>><<< 13273 1726853304.18088: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853304.18091: _low_level_execute_command(): starting 13273 1726853304.18095: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/AnsiballZ_stat.py && sleep 0' 13273 1726853304.18518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.18522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853304.18524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.18526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.18529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.18573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.18603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.18719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.34351: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13273 1726853304.35693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853304.35720: stderr chunk (state=3): >>><<< 13273 1726853304.35724: stdout chunk (state=3): >>><<< 13273 1726853304.35741: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853304.35768: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853304.35776: _low_level_execute_command(): starting 13273 1726853304.35781: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853304.079539-14349-58480531893917/ > /dev/null 2>&1 && sleep 0' 13273 1726853304.36240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.36246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853304.36249: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.36251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.36253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.36302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.36314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.36317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.36368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.38301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.38327: stderr chunk (state=3): >>><<< 13273 1726853304.38330: stdout chunk (state=3): >>><<< 13273 1726853304.38342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853304.38351: handler run complete 13273 1726853304.38366: attempt loop complete, returning result 13273 1726853304.38369: _execute() done 13273 1726853304.38372: dumping result to json 13273 1726853304.38375: done dumping result, returning 13273 1726853304.38384: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-5fc3-657d-000000000675] 13273 1726853304.38387: sending task result for task 02083763-bbaf-5fc3-657d-000000000675 13273 1726853304.38481: done sending task result for task 02083763-bbaf-5fc3-657d-000000000675 13273 1726853304.38485: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 13273 1726853304.38536: no more pending results, returning what we have 13273 1726853304.38539: results queue empty 13273 1726853304.38540: checking for any_errors_fatal 13273 1726853304.38549: done checking for any_errors_fatal 13273 1726853304.38549: checking for max_fail_percentage 13273 1726853304.38551: done checking for max_fail_percentage 13273 1726853304.38551: checking to see if all hosts have failed and the running result is not ok 13273 1726853304.38552: done checking to see if all hosts have failed 13273 1726853304.38553: getting the remaining hosts for this loop 13273 
1726853304.38554: done getting the remaining hosts for this loop 13273 1726853304.38557: getting the next task for host managed_node3 13273 1726853304.38563: done getting next task for host managed_node3 13273 1726853304.38566: ^ task is: TASK: Set NM profile exist flag based on the profile files 13273 1726853304.38569: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853304.38579: getting variables 13273 1726853304.38580: in VariableManager get_vars() 13273 1726853304.38633: Calling all_inventory to load vars for managed_node3 13273 1726853304.38636: Calling groups_inventory to load vars for managed_node3 13273 1726853304.38638: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853304.38651: Calling all_plugins_play to load vars for managed_node3 13273 1726853304.38653: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853304.38656: Calling groups_plugins_play to load vars for managed_node3 13273 1726853304.39579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853304.40425: done with get_vars() 13273 1726853304.40439: done getting variables 13273 1726853304.40485: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:28:24 -0400 (0:00:00.376) 0:00:22.294 ****** 13273 1726853304.40506: entering _queue_task() for managed_node3/set_fact 13273 1726853304.40721: worker is 1 (out of 1 available) 13273 1726853304.40733: exiting _queue_task() for managed_node3/set_fact 13273 1726853304.40747: done queuing things up, now waiting for results queue to drain 13273 1726853304.40748: waiting for pending results... 
13273 1726853304.40916: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 13273 1726853304.40990: in run() - task 02083763-bbaf-5fc3-657d-000000000676 13273 1726853304.41001: variable 'ansible_search_path' from source: unknown 13273 1726853304.41005: variable 'ansible_search_path' from source: unknown 13273 1726853304.41033: calling self._execute() 13273 1726853304.41107: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.41113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.41121: variable 'omit' from source: magic vars 13273 1726853304.41385: variable 'ansible_distribution_major_version' from source: facts 13273 1726853304.41394: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853304.41480: variable 'profile_stat' from source: set_fact 13273 1726853304.41491: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853304.41494: when evaluation is False, skipping this task 13273 1726853304.41497: _execute() done 13273 1726853304.41499: dumping result to json 13273 1726853304.41501: done dumping result, returning 13273 1726853304.41509: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-5fc3-657d-000000000676] 13273 1726853304.41515: sending task result for task 02083763-bbaf-5fc3-657d-000000000676 13273 1726853304.41595: done sending task result for task 02083763-bbaf-5fc3-657d-000000000676 13273 1726853304.41599: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853304.41668: no more pending results, returning what we have 13273 1726853304.41673: results queue empty 13273 1726853304.41674: checking for any_errors_fatal 13273 1726853304.41679: done checking for any_errors_fatal 13273 1726853304.41680: 
checking for max_fail_percentage 13273 1726853304.41681: done checking for max_fail_percentage 13273 1726853304.41682: checking to see if all hosts have failed and the running result is not ok 13273 1726853304.41682: done checking to see if all hosts have failed 13273 1726853304.41683: getting the remaining hosts for this loop 13273 1726853304.41684: done getting the remaining hosts for this loop 13273 1726853304.41686: getting the next task for host managed_node3 13273 1726853304.41692: done getting next task for host managed_node3 13273 1726853304.41694: ^ task is: TASK: Get NM profile info 13273 1726853304.41697: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853304.41701: getting variables 13273 1726853304.41702: in VariableManager get_vars() 13273 1726853304.41741: Calling all_inventory to load vars for managed_node3 13273 1726853304.41746: Calling groups_inventory to load vars for managed_node3 13273 1726853304.41748: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853304.41757: Calling all_plugins_play to load vars for managed_node3 13273 1726853304.41760: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853304.41762: Calling groups_plugins_play to load vars for managed_node3 13273 1726853304.42485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853304.43334: done with get_vars() 13273 1726853304.43350: done getting variables 13273 1726853304.43395: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:28:24 -0400 (0:00:00.029) 0:00:22.323 ****** 13273 1726853304.43416: entering _queue_task() for managed_node3/shell 13273 1726853304.43620: worker is 1 (out of 1 available) 13273 1726853304.43633: exiting _queue_task() for managed_node3/shell 13273 1726853304.43648: done queuing things up, now waiting for results queue to drain 13273 1726853304.43650: waiting for pending results... 
13273 1726853304.43805: running TaskExecutor() for managed_node3/TASK: Get NM profile info 13273 1726853304.43870: in run() - task 02083763-bbaf-5fc3-657d-000000000677 13273 1726853304.43883: variable 'ansible_search_path' from source: unknown 13273 1726853304.43886: variable 'ansible_search_path' from source: unknown 13273 1726853304.43914: calling self._execute() 13273 1726853304.43988: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.43992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.44001: variable 'omit' from source: magic vars 13273 1726853304.44261: variable 'ansible_distribution_major_version' from source: facts 13273 1726853304.44272: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853304.44279: variable 'omit' from source: magic vars 13273 1726853304.44314: variable 'omit' from source: magic vars 13273 1726853304.44382: variable 'profile' from source: include params 13273 1726853304.44385: variable 'item' from source: include params 13273 1726853304.44432: variable 'item' from source: include params 13273 1726853304.44448: variable 'omit' from source: magic vars 13273 1726853304.44479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853304.44506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853304.44520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853304.44533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853304.44547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853304.44567: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 
1726853304.44572: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.44574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.44637: Set connection var ansible_connection to ssh 13273 1726853304.44647: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853304.44650: Set connection var ansible_shell_executable to /bin/sh 13273 1726853304.44653: Set connection var ansible_shell_type to sh 13273 1726853304.44659: Set connection var ansible_pipelining to False 13273 1726853304.44662: Set connection var ansible_timeout to 10 13273 1726853304.44685: variable 'ansible_shell_executable' from source: unknown 13273 1726853304.44688: variable 'ansible_connection' from source: unknown 13273 1726853304.44691: variable 'ansible_module_compression' from source: unknown 13273 1726853304.44693: variable 'ansible_shell_type' from source: unknown 13273 1726853304.44695: variable 'ansible_shell_executable' from source: unknown 13273 1726853304.44697: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.44700: variable 'ansible_pipelining' from source: unknown 13273 1726853304.44702: variable 'ansible_timeout' from source: unknown 13273 1726853304.44706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.44807: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853304.44816: variable 'omit' from source: magic vars 13273 1726853304.44821: starting attempt loop 13273 1726853304.44824: running the handler 13273 1726853304.44833: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853304.44849: _low_level_execute_command(): starting 13273 1726853304.44856: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853304.45380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.45384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853304.45386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.45389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.45391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.45440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.45444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.45446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.45518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.47219: stdout chunk 
(state=3): >>>/root <<< 13273 1726853304.47315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.47347: stderr chunk (state=3): >>><<< 13273 1726853304.47350: stdout chunk (state=3): >>><<< 13273 1726853304.47368: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853304.47382: _low_level_execute_command(): starting 13273 1726853304.47389: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110 `" && echo ansible-tmp-1726853304.4736726-14364-139071666859110="` echo /root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110 `" ) && sleep 0' 13273 1726853304.47833: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.47836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853304.47839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.47845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853304.47847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.47892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.47896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.47898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.47958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.49896: stdout chunk (state=3): >>>ansible-tmp-1726853304.4736726-14364-139071666859110=/root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110 <<< 13273 1726853304.50005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.50027: stderr chunk (state=3): >>><<< 13273 1726853304.50030: stdout chunk (state=3): >>><<< 13273 1726853304.50046: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1726853304.4736726-14364-139071666859110=/root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853304.50068: variable 'ansible_module_compression' from source: unknown 13273 1726853304.50111: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853304.50138: variable 'ansible_facts' from source: unknown 13273 1726853304.50203: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/AnsiballZ_command.py 13273 1726853304.50483: Sending initial data 13273 1726853304.50486: Sent initial data (156 bytes) 13273 1726853304.51020: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.51034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.51119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.52746: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853304.52752: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853304.52798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853304.52859: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpznqk3qto /root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/AnsiballZ_command.py <<< 13273 1726853304.52864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/AnsiballZ_command.py" <<< 13273 1726853304.52916: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpznqk3qto" to remote "/root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/AnsiballZ_command.py" <<< 13273 1726853304.53940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.54018: stderr chunk (state=3): >>><<< 13273 1726853304.54021: stdout chunk (state=3): >>><<< 13273 1726853304.54023: done transferring module to remote 13273 1726853304.54035: _low_level_execute_command(): starting 13273 1726853304.54047: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/ /root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/AnsiballZ_command.py && sleep 0' 13273 1726853304.54684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.54756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.54787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.54868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.56977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.56980: stdout chunk (state=3): >>><<< 13273 1726853304.56983: stderr chunk (state=3): >>><<< 13273 1726853304.56985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853304.56988: _low_level_execute_command(): starting 13273 1726853304.56990: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/AnsiballZ_command.py && sleep 0' 13273 1726853304.57498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853304.57501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.57522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.57592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853304.57642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.57656: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 13273 1726853304.57675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.57776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.75509: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:28:24.731756", "end": "2024-09-20 13:28:24.752832", "delta": "0:00:00.021076", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853304.77149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853304.77433: stderr chunk (state=3): >>><<< 13273 1726853304.77437: stdout chunk (state=3): >>><<< 13273 1726853304.77440: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:28:24.731756", "end": "2024-09-20 13:28:24.752832", "delta": "0:00:00.021076", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.11.217 closed. 13273 1726853304.77447: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853304.77451: _low_level_execute_command(): starting 13273 1726853304.77454: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853304.4736726-14364-139071666859110/ > /dev/null 2>&1 && sleep 0' 13273 1726853304.78425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853304.78552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853304.78569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853304.78589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853304.78612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853304.78717: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853304.78734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853304.78756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853304.78853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853304.80778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853304.80788: stdout chunk (state=3): >>><<< 13273 1726853304.80798: stderr chunk (state=3): >>><<< 13273 1726853304.80830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853304.80886: handler run complete 13273 1726853304.80912: Evaluated conditional (False): False 13273 1726853304.81477: attempt loop complete, returning result 13273 1726853304.81480: _execute() done 13273 1726853304.81483: dumping result to json 13273 1726853304.81485: done dumping result, returning 13273 1726853304.81487: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-5fc3-657d-000000000677] 13273 1726853304.81489: sending task result for task 02083763-bbaf-5fc3-657d-000000000677 13273 1726853304.81563: done sending task result for task 02083763-bbaf-5fc3-657d-000000000677 13273 1726853304.81567: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.021076", "end": "2024-09-20 13:28:24.752832", "rc": 0, "start": "2024-09-20 13:28:24.731756" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 13273 1726853304.81635: no more pending results, returning what we have 13273 1726853304.81638: results queue empty 13273 1726853304.81639: checking for any_errors_fatal 13273 1726853304.81646: done checking for any_errors_fatal 13273 1726853304.81646: checking for max_fail_percentage 13273 1726853304.81648: done checking for max_fail_percentage 13273 1726853304.81648: checking to see if all hosts have failed and the running result is not ok 13273 1726853304.81649: done checking to see if all hosts have failed 13273 1726853304.81650: getting the remaining hosts for this loop 13273 1726853304.81651: done getting the remaining hosts for this loop 13273 1726853304.81655: getting the next task for host managed_node3 13273 1726853304.81662: done getting 
next task for host managed_node3 13273 1726853304.81665: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13273 1726853304.81669: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853304.81674: getting variables 13273 1726853304.81675: in VariableManager get_vars() 13273 1726853304.81728: Calling all_inventory to load vars for managed_node3 13273 1726853304.81730: Calling groups_inventory to load vars for managed_node3 13273 1726853304.81732: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853304.81742: Calling all_plugins_play to load vars for managed_node3 13273 1726853304.81745: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853304.81748: Calling groups_plugins_play to load vars for managed_node3 13273 1726853304.84817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853304.88104: done with get_vars() 13273 1726853304.88132: done getting variables 13273 1726853304.88418: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:28:24 -0400 (0:00:00.450) 0:00:22.773 ****** 13273 1726853304.88451: entering _queue_task() for managed_node3/set_fact 13273 1726853304.89386: worker is 1 (out of 1 available) 13273 1726853304.89395: exiting _queue_task() for managed_node3/set_fact 13273 1726853304.89408: done queuing things up, now waiting for results queue to drain 13273 1726853304.89409: waiting for pending results... 
13273 1726853304.89721: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13273 1726853304.90181: in run() - task 02083763-bbaf-5fc3-657d-000000000678 13273 1726853304.90186: variable 'ansible_search_path' from source: unknown 13273 1726853304.90188: variable 'ansible_search_path' from source: unknown 13273 1726853304.90192: calling self._execute() 13273 1726853304.90639: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.90644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.90647: variable 'omit' from source: magic vars 13273 1726853304.91352: variable 'ansible_distribution_major_version' from source: facts 13273 1726853304.91424: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853304.91826: variable 'nm_profile_exists' from source: set_fact 13273 1726853304.91867: Evaluated conditional (nm_profile_exists.rc == 0): True 13273 1726853304.91908: variable 'omit' from source: magic vars 13273 1726853304.92292: variable 'omit' from source: magic vars 13273 1726853304.92296: variable 'omit' from source: magic vars 13273 1726853304.92312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853304.92414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853304.92440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853304.92532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853304.92564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853304.92638: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
13273 1726853304.92682: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.92689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.92845: Set connection var ansible_connection to ssh 13273 1726853304.92862: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853304.92875: Set connection var ansible_shell_executable to /bin/sh 13273 1726853304.92882: Set connection var ansible_shell_type to sh 13273 1726853304.92934: Set connection var ansible_pipelining to False 13273 1726853304.92946: Set connection var ansible_timeout to 10 13273 1726853304.92982: variable 'ansible_shell_executable' from source: unknown 13273 1726853304.92991: variable 'ansible_connection' from source: unknown 13273 1726853304.92998: variable 'ansible_module_compression' from source: unknown 13273 1726853304.93004: variable 'ansible_shell_type' from source: unknown 13273 1726853304.93010: variable 'ansible_shell_executable' from source: unknown 13273 1726853304.93015: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.93024: variable 'ansible_pipelining' from source: unknown 13273 1726853304.93030: variable 'ansible_timeout' from source: unknown 13273 1726853304.93047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.93195: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853304.93210: variable 'omit' from source: magic vars 13273 1726853304.93220: starting attempt loop 13273 1726853304.93226: running the handler 13273 1726853304.93244: handler run complete 13273 1726853304.93268: attempt loop complete, returning result 13273 1726853304.93366: _execute() done 
13273 1726853304.93369: dumping result to json 13273 1726853304.93373: done dumping result, returning 13273 1726853304.93375: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-5fc3-657d-000000000678] 13273 1726853304.93377: sending task result for task 02083763-bbaf-5fc3-657d-000000000678 13273 1726853304.93445: done sending task result for task 02083763-bbaf-5fc3-657d-000000000678 13273 1726853304.93449: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13273 1726853304.93529: no more pending results, returning what we have 13273 1726853304.93532: results queue empty 13273 1726853304.93533: checking for any_errors_fatal 13273 1726853304.93542: done checking for any_errors_fatal 13273 1726853304.93542: checking for max_fail_percentage 13273 1726853304.93544: done checking for max_fail_percentage 13273 1726853304.93545: checking to see if all hosts have failed and the running result is not ok 13273 1726853304.93546: done checking to see if all hosts have failed 13273 1726853304.93546: getting the remaining hosts for this loop 13273 1726853304.93548: done getting the remaining hosts for this loop 13273 1726853304.93551: getting the next task for host managed_node3 13273 1726853304.93560: done getting next task for host managed_node3 13273 1726853304.93562: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13273 1726853304.93567: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853304.93749: getting variables 13273 1726853304.93752: in VariableManager get_vars() 13273 1726853304.93801: Calling all_inventory to load vars for managed_node3 13273 1726853304.93804: Calling groups_inventory to load vars for managed_node3 13273 1726853304.93806: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853304.93815: Calling all_plugins_play to load vars for managed_node3 13273 1726853304.93818: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853304.93822: Calling groups_plugins_play to load vars for managed_node3 13273 1726853304.95308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853304.97301: done with get_vars() 13273 1726853304.97322: done getting variables 13273 1726853304.97412: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853304.97522: variable 'profile' from source: include params 13273 1726853304.97526: variable 'item' from source: include params 13273 1726853304.97589: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:28:24 -0400 (0:00:00.091) 0:00:22.865 ****** 13273 1726853304.97621: entering _queue_task() for managed_node3/command 13273 1726853304.98134: worker is 1 (out of 1 available) 13273 1726853304.98144: exiting _queue_task() for managed_node3/command 13273 1726853304.98154: done queuing things up, now waiting for results queue to drain 13273 1726853304.98155: waiting for pending results... 13273 1726853304.98398: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 13273 1726853304.98550: in run() - task 02083763-bbaf-5fc3-657d-00000000067a 13273 1726853304.98554: variable 'ansible_search_path' from source: unknown 13273 1726853304.98557: variable 'ansible_search_path' from source: unknown 13273 1726853304.98597: calling self._execute() 13273 1726853304.98706: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853304.98767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853304.98773: variable 'omit' from source: magic vars 13273 1726853304.99250: variable 'ansible_distribution_major_version' from source: facts 13273 1726853304.99268: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853304.99399: variable 'profile_stat' from source: set_fact 13273 1726853304.99415: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853304.99427: when evaluation is False, skipping this task 13273 1726853304.99433: _execute() done 13273 1726853304.99439: dumping result to json 13273 1726853304.99444: done dumping result, returning 13273 1726853304.99463: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [02083763-bbaf-5fc3-657d-00000000067a] 13273 1726853304.99466: sending task result for task 02083763-bbaf-5fc3-657d-00000000067a 13273 
1726853304.99684: done sending task result for task 02083763-bbaf-5fc3-657d-00000000067a 13273 1726853304.99688: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853304.99743: no more pending results, returning what we have 13273 1726853304.99748: results queue empty 13273 1726853304.99749: checking for any_errors_fatal 13273 1726853304.99757: done checking for any_errors_fatal 13273 1726853304.99758: checking for max_fail_percentage 13273 1726853304.99760: done checking for max_fail_percentage 13273 1726853304.99761: checking to see if all hosts have failed and the running result is not ok 13273 1726853304.99762: done checking to see if all hosts have failed 13273 1726853304.99762: getting the remaining hosts for this loop 13273 1726853304.99764: done getting the remaining hosts for this loop 13273 1726853304.99767: getting the next task for host managed_node3 13273 1726853304.99777: done getting next task for host managed_node3 13273 1726853304.99780: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13273 1726853304.99785: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13273 1726853304.99791: getting variables 13273 1726853304.99793: in VariableManager get_vars() 13273 1726853304.99850: Calling all_inventory to load vars for managed_node3 13273 1726853304.99854: Calling groups_inventory to load vars for managed_node3 13273 1726853304.99856: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853304.99869: Calling all_plugins_play to load vars for managed_node3 13273 1726853305.00009: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853305.00014: Calling groups_plugins_play to load vars for managed_node3 13273 1726853305.05636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853305.07843: done with get_vars() 13273 1726853305.08002: done getting variables 13273 1726853305.08054: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853305.08261: variable 'profile' from source: include params 13273 1726853305.08265: variable 'item' from source: include params 13273 1726853305.08424: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:28:25 -0400 (0:00:00.108) 0:00:22.973 ****** 13273 1726853305.08451: entering _queue_task() for managed_node3/set_fact 13273 1726853305.09101: worker is 1 (out of 1 available) 13273 1726853305.09114: exiting _queue_task() for managed_node3/set_fact 13273 1726853305.09127: done queuing things up, now waiting for results queue 
to drain 13273 1726853305.09129: waiting for pending results... 13273 1726853305.09793: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 13273 1726853305.10077: in run() - task 02083763-bbaf-5fc3-657d-00000000067b 13273 1726853305.10082: variable 'ansible_search_path' from source: unknown 13273 1726853305.10084: variable 'ansible_search_path' from source: unknown 13273 1726853305.10087: calling self._execute() 13273 1726853305.10274: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.10288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.10301: variable 'omit' from source: magic vars 13273 1726853305.10961: variable 'ansible_distribution_major_version' from source: facts 13273 1726853305.10976: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853305.11118: variable 'profile_stat' from source: set_fact 13273 1726853305.11138: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853305.11148: when evaluation is False, skipping this task 13273 1726853305.11155: _execute() done 13273 1726853305.11164: dumping result to json 13273 1726853305.11174: done dumping result, returning 13273 1726853305.11187: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [02083763-bbaf-5fc3-657d-00000000067b] 13273 1726853305.11197: sending task result for task 02083763-bbaf-5fc3-657d-00000000067b skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853305.11375: no more pending results, returning what we have 13273 1726853305.11379: results queue empty 13273 1726853305.11380: checking for any_errors_fatal 13273 1726853305.11387: done checking for any_errors_fatal 13273 1726853305.11388: checking for max_fail_percentage 13273 1726853305.11390: done checking for 
max_fail_percentage 13273 1726853305.11391: checking to see if all hosts have failed and the running result is not ok 13273 1726853305.11392: done checking to see if all hosts have failed 13273 1726853305.11392: getting the remaining hosts for this loop 13273 1726853305.11394: done getting the remaining hosts for this loop 13273 1726853305.11398: getting the next task for host managed_node3 13273 1726853305.11406: done getting next task for host managed_node3 13273 1726853305.11409: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13273 1726853305.11413: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853305.11418: getting variables 13273 1726853305.11420: in VariableManager get_vars() 13273 1726853305.11625: Calling all_inventory to load vars for managed_node3 13273 1726853305.11628: Calling groups_inventory to load vars for managed_node3 13273 1726853305.11631: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853305.11645: Calling all_plugins_play to load vars for managed_node3 13273 1726853305.11648: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853305.11651: Calling groups_plugins_play to load vars for managed_node3 13273 1726853305.12216: done sending task result for task 02083763-bbaf-5fc3-657d-00000000067b 13273 1726853305.12220: WORKER PROCESS EXITING 13273 1726853305.13543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853305.15361: done with get_vars() 13273 1726853305.15385: done getting variables 13273 1726853305.15443: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853305.15553: variable 'profile' from source: include params 13273 1726853305.15561: variable 'item' from source: include params 13273 1726853305.15621: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:28:25 -0400 (0:00:00.071) 0:00:23.045 ****** 13273 1726853305.15650: entering _queue_task() for managed_node3/command 13273 1726853305.16083: worker is 1 (out of 1 available) 13273 1726853305.16093: exiting _queue_task() for managed_node3/command 13273 
1726853305.16109: done queuing things up, now waiting for results queue to drain 13273 1726853305.16110: waiting for pending results... 13273 1726853305.16319: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 13273 1726853305.16466: in run() - task 02083763-bbaf-5fc3-657d-00000000067c 13273 1726853305.16489: variable 'ansible_search_path' from source: unknown 13273 1726853305.16496: variable 'ansible_search_path' from source: unknown 13273 1726853305.16543: calling self._execute() 13273 1726853305.16708: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.16745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.16980: variable 'omit' from source: magic vars 13273 1726853305.17523: variable 'ansible_distribution_major_version' from source: facts 13273 1726853305.17548: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853305.17791: variable 'profile_stat' from source: set_fact 13273 1726853305.17821: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853305.17830: when evaluation is False, skipping this task 13273 1726853305.17839: _execute() done 13273 1726853305.17848: dumping result to json 13273 1726853305.17883: done dumping result, returning 13273 1726853305.18082: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [02083763-bbaf-5fc3-657d-00000000067c] 13273 1726853305.18085: sending task result for task 02083763-bbaf-5fc3-657d-00000000067c 13273 1726853305.18153: done sending task result for task 02083763-bbaf-5fc3-657d-00000000067c 13273 1726853305.18156: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853305.18210: no more pending results, returning what we have 13273 1726853305.18214: results queue empty 13273 
1726853305.18215: checking for any_errors_fatal 13273 1726853305.18223: done checking for any_errors_fatal 13273 1726853305.18223: checking for max_fail_percentage 13273 1726853305.18225: done checking for max_fail_percentage 13273 1726853305.18226: checking to see if all hosts have failed and the running result is not ok 13273 1726853305.18226: done checking to see if all hosts have failed 13273 1726853305.18227: getting the remaining hosts for this loop 13273 1726853305.18229: done getting the remaining hosts for this loop 13273 1726853305.18232: getting the next task for host managed_node3 13273 1726853305.18239: done getting next task for host managed_node3 13273 1726853305.18242: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13273 1726853305.18246: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853305.18251: getting variables 13273 1726853305.18253: in VariableManager get_vars() 13273 1726853305.18312: Calling all_inventory to load vars for managed_node3 13273 1726853305.18316: Calling groups_inventory to load vars for managed_node3 13273 1726853305.18319: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853305.18332: Calling all_plugins_play to load vars for managed_node3 13273 1726853305.18335: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853305.18338: Calling groups_plugins_play to load vars for managed_node3 13273 1726853305.20432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853305.22628: done with get_vars() 13273 1726853305.22676: done getting variables 13273 1726853305.22811: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853305.22947: variable 'profile' from source: include params 13273 1726853305.22950: variable 'item' from source: include params 13273 1726853305.23112: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:28:25 -0400 (0:00:00.074) 0:00:23.120 ****** 13273 1726853305.23142: entering _queue_task() for managed_node3/set_fact 13273 1726853305.23924: worker is 1 (out of 1 available) 13273 1726853305.23936: exiting _queue_task() for managed_node3/set_fact 13273 1726853305.23949: done queuing things up, now waiting for results queue to drain 13273 1726853305.23950: waiting for pending results... 
13273 1726853305.24429: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 13273 1726853305.24435: in run() - task 02083763-bbaf-5fc3-657d-00000000067d 13273 1726853305.24438: variable 'ansible_search_path' from source: unknown 13273 1726853305.24443: variable 'ansible_search_path' from source: unknown 13273 1726853305.24485: calling self._execute() 13273 1726853305.24590: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.24603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.24617: variable 'omit' from source: magic vars 13273 1726853305.25004: variable 'ansible_distribution_major_version' from source: facts 13273 1726853305.25022: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853305.25150: variable 'profile_stat' from source: set_fact 13273 1726853305.25175: Evaluated conditional (profile_stat.stat.exists): False 13273 1726853305.25184: when evaluation is False, skipping this task 13273 1726853305.25279: _execute() done 13273 1726853305.25282: dumping result to json 13273 1726853305.25284: done dumping result, returning 13273 1726853305.25287: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [02083763-bbaf-5fc3-657d-00000000067d] 13273 1726853305.25289: sending task result for task 02083763-bbaf-5fc3-657d-00000000067d 13273 1726853305.25357: done sending task result for task 02083763-bbaf-5fc3-657d-00000000067d 13273 1726853305.25360: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13273 1726853305.25430: no more pending results, returning what we have 13273 1726853305.25434: results queue empty 13273 1726853305.25435: checking for any_errors_fatal 13273 1726853305.25440: done checking for any_errors_fatal 13273 1726853305.25441: checking 
for max_fail_percentage 13273 1726853305.25443: done checking for max_fail_percentage 13273 1726853305.25443: checking to see if all hosts have failed and the running result is not ok 13273 1726853305.25444: done checking to see if all hosts have failed 13273 1726853305.25445: getting the remaining hosts for this loop 13273 1726853305.25446: done getting the remaining hosts for this loop 13273 1726853305.25450: getting the next task for host managed_node3 13273 1726853305.25457: done getting next task for host managed_node3 13273 1726853305.25459: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13273 1726853305.25463: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853305.25474: getting variables 13273 1726853305.25476: in VariableManager get_vars() 13273 1726853305.25531: Calling all_inventory to load vars for managed_node3 13273 1726853305.25534: Calling groups_inventory to load vars for managed_node3 13273 1726853305.25536: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853305.25548: Calling all_plugins_play to load vars for managed_node3 13273 1726853305.25552: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853305.25555: Calling groups_plugins_play to load vars for managed_node3 13273 1726853305.28285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853305.29152: done with get_vars() 13273 1726853305.29167: done getting variables 13273 1726853305.29212: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853305.29296: variable 'profile' from source: include params 13273 1726853305.29299: variable 'item' from source: include params 13273 1726853305.29338: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:28:25 -0400 (0:00:00.062) 0:00:23.182 ****** 13273 1726853305.29361: entering _queue_task() for managed_node3/assert 13273 1726853305.29601: worker is 1 (out of 1 available) 13273 1726853305.29616: exiting _queue_task() for managed_node3/assert 13273 1726853305.29629: done queuing things up, now waiting for results queue to drain 13273 1726853305.29630: waiting for pending results... 
13273 1726853305.29812: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 13273 1726853305.29946: in run() - task 02083763-bbaf-5fc3-657d-000000000364 13273 1726853305.29961: variable 'ansible_search_path' from source: unknown 13273 1726853305.29965: variable 'ansible_search_path' from source: unknown 13273 1726853305.30020: calling self._execute() 13273 1726853305.30108: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.30112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.30115: variable 'omit' from source: magic vars 13273 1726853305.30501: variable 'ansible_distribution_major_version' from source: facts 13273 1726853305.30505: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853305.30508: variable 'omit' from source: magic vars 13273 1726853305.30532: variable 'omit' from source: magic vars 13273 1726853305.30677: variable 'profile' from source: include params 13273 1726853305.30681: variable 'item' from source: include params 13273 1726853305.30699: variable 'item' from source: include params 13273 1726853305.30718: variable 'omit' from source: magic vars 13273 1726853305.30780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853305.30799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853305.30824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853305.30835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.30849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.30880: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13273 1726853305.30884: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.30886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.31013: Set connection var ansible_connection to ssh 13273 1726853305.31016: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853305.31020: Set connection var ansible_shell_executable to /bin/sh 13273 1726853305.31023: Set connection var ansible_shell_type to sh 13273 1726853305.31025: Set connection var ansible_pipelining to False 13273 1726853305.31031: Set connection var ansible_timeout to 10 13273 1726853305.31040: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.31048: variable 'ansible_connection' from source: unknown 13273 1726853305.31050: variable 'ansible_module_compression' from source: unknown 13273 1726853305.31053: variable 'ansible_shell_type' from source: unknown 13273 1726853305.31055: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.31058: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.31060: variable 'ansible_pipelining' from source: unknown 13273 1726853305.31062: variable 'ansible_timeout' from source: unknown 13273 1726853305.31065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.31200: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853305.31248: variable 'omit' from source: magic vars 13273 1726853305.31253: starting attempt loop 13273 1726853305.31256: running the handler 13273 1726853305.31450: variable 'lsr_net_profile_exists' from source: set_fact 13273 1726853305.31453: Evaluated conditional 
(lsr_net_profile_exists): True 13273 1726853305.31455: handler run complete 13273 1726853305.31457: attempt loop complete, returning result 13273 1726853305.31459: _execute() done 13273 1726853305.31461: dumping result to json 13273 1726853305.31463: done dumping result, returning 13273 1726853305.31464: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [02083763-bbaf-5fc3-657d-000000000364] 13273 1726853305.31466: sending task result for task 02083763-bbaf-5fc3-657d-000000000364 13273 1726853305.31526: done sending task result for task 02083763-bbaf-5fc3-657d-000000000364 13273 1726853305.31529: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853305.31576: no more pending results, returning what we have 13273 1726853305.31579: results queue empty 13273 1726853305.31580: checking for any_errors_fatal 13273 1726853305.31585: done checking for any_errors_fatal 13273 1726853305.31586: checking for max_fail_percentage 13273 1726853305.31587: done checking for max_fail_percentage 13273 1726853305.31588: checking to see if all hosts have failed and the running result is not ok 13273 1726853305.31589: done checking to see if all hosts have failed 13273 1726853305.31590: getting the remaining hosts for this loop 13273 1726853305.31591: done getting the remaining hosts for this loop 13273 1726853305.31594: getting the next task for host managed_node3 13273 1726853305.31599: done getting next task for host managed_node3 13273 1726853305.31601: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13273 1726853305.31604: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853305.31608: getting variables 13273 1726853305.31609: in VariableManager get_vars() 13273 1726853305.31648: Calling all_inventory to load vars for managed_node3 13273 1726853305.31651: Calling groups_inventory to load vars for managed_node3 13273 1726853305.31653: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853305.31661: Calling all_plugins_play to load vars for managed_node3 13273 1726853305.31663: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853305.31665: Calling groups_plugins_play to load vars for managed_node3 13273 1726853305.32768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853305.33641: done with get_vars() 13273 1726853305.33658: done getting variables 13273 1726853305.33699: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853305.33781: variable 'profile' from source: include params 13273 1726853305.33784: variable 'item' from source: include params 13273 1726853305.33821: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:28:25 -0400 
(0:00:00.044) 0:00:23.227 ****** 13273 1726853305.33850: entering _queue_task() for managed_node3/assert 13273 1726853305.34072: worker is 1 (out of 1 available) 13273 1726853305.34087: exiting _queue_task() for managed_node3/assert 13273 1726853305.34099: done queuing things up, now waiting for results queue to drain 13273 1726853305.34100: waiting for pending results... 13273 1726853305.34390: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 13273 1726853305.34449: in run() - task 02083763-bbaf-5fc3-657d-000000000365 13273 1726853305.34473: variable 'ansible_search_path' from source: unknown 13273 1726853305.34482: variable 'ansible_search_path' from source: unknown 13273 1726853305.34530: calling self._execute() 13273 1726853305.34627: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.34677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.34681: variable 'omit' from source: magic vars 13273 1726853305.34999: variable 'ansible_distribution_major_version' from source: facts 13273 1726853305.35016: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853305.35030: variable 'omit' from source: magic vars 13273 1726853305.35070: variable 'omit' from source: magic vars 13273 1726853305.35184: variable 'profile' from source: include params 13273 1726853305.35192: variable 'item' from source: include params 13273 1726853305.35236: variable 'item' from source: include params 13273 1726853305.35259: variable 'omit' from source: magic vars 13273 1726853305.35289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853305.35316: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853305.35330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 
1726853305.35354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.35358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.35381: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853305.35384: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.35389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.35460: Set connection var ansible_connection to ssh 13273 1726853305.35469: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853305.35474: Set connection var ansible_shell_executable to /bin/sh 13273 1726853305.35477: Set connection var ansible_shell_type to sh 13273 1726853305.35482: Set connection var ansible_pipelining to False 13273 1726853305.35487: Set connection var ansible_timeout to 10 13273 1726853305.35506: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.35509: variable 'ansible_connection' from source: unknown 13273 1726853305.35511: variable 'ansible_module_compression' from source: unknown 13273 1726853305.35514: variable 'ansible_shell_type' from source: unknown 13273 1726853305.35516: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.35519: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.35521: variable 'ansible_pipelining' from source: unknown 13273 1726853305.35523: variable 'ansible_timeout' from source: unknown 13273 1726853305.35528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.35647: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853305.35653: variable 'omit' from source: magic vars 13273 1726853305.35659: starting attempt loop 13273 1726853305.35661: running the handler 13273 1726853305.35736: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13273 1726853305.35739: Evaluated conditional (lsr_net_profile_ansible_managed): True 13273 1726853305.35747: handler run complete 13273 1726853305.35757: attempt loop complete, returning result 13273 1726853305.35760: _execute() done 13273 1726853305.35763: dumping result to json 13273 1726853305.35765: done dumping result, returning 13273 1726853305.35772: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [02083763-bbaf-5fc3-657d-000000000365] 13273 1726853305.35777: sending task result for task 02083763-bbaf-5fc3-657d-000000000365 13273 1726853305.35860: done sending task result for task 02083763-bbaf-5fc3-657d-000000000365 13273 1726853305.35863: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853305.35932: no more pending results, returning what we have 13273 1726853305.35935: results queue empty 13273 1726853305.35936: checking for any_errors_fatal 13273 1726853305.35941: done checking for any_errors_fatal 13273 1726853305.35941: checking for max_fail_percentage 13273 1726853305.35946: done checking for max_fail_percentage 13273 1726853305.35946: checking to see if all hosts have failed and the running result is not ok 13273 1726853305.35947: done checking to see if all hosts have failed 13273 1726853305.35948: getting the remaining hosts for this loop 13273 1726853305.35949: done getting the remaining hosts for this loop 13273 1726853305.35952: getting the next task for host managed_node3 13273 1726853305.35957: done getting 
next task for host managed_node3 13273 1726853305.35959: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13273 1726853305.35962: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853305.35965: getting variables 13273 1726853305.35966: in VariableManager get_vars() 13273 1726853305.36009: Calling all_inventory to load vars for managed_node3 13273 1726853305.36012: Calling groups_inventory to load vars for managed_node3 13273 1726853305.36014: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853305.36022: Calling all_plugins_play to load vars for managed_node3 13273 1726853305.36024: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853305.36027: Calling groups_plugins_play to load vars for managed_node3 13273 1726853305.36886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853305.37731: done with get_vars() 13273 1726853305.37748: done getting variables 13273 1726853305.37788: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853305.37861: variable 'profile' from source: include params 13273 1726853305.37864: variable 'item' from 
source: include params 13273 1726853305.37902: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:28:25 -0400 (0:00:00.040) 0:00:23.268 ****** 13273 1726853305.37929: entering _queue_task() for managed_node3/assert 13273 1726853305.38137: worker is 1 (out of 1 available) 13273 1726853305.38152: exiting _queue_task() for managed_node3/assert 13273 1726853305.38163: done queuing things up, now waiting for results queue to drain 13273 1726853305.38164: waiting for pending results... 13273 1726853305.38342: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 13273 1726853305.38408: in run() - task 02083763-bbaf-5fc3-657d-000000000366 13273 1726853305.38419: variable 'ansible_search_path' from source: unknown 13273 1726853305.38423: variable 'ansible_search_path' from source: unknown 13273 1726853305.38453: calling self._execute() 13273 1726853305.38532: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.38537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.38548: variable 'omit' from source: magic vars 13273 1726853305.38816: variable 'ansible_distribution_major_version' from source: facts 13273 1726853305.38830: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853305.38833: variable 'omit' from source: magic vars 13273 1726853305.38861: variable 'omit' from source: magic vars 13273 1726853305.38928: variable 'profile' from source: include params 13273 1726853305.38934: variable 'item' from source: include params 13273 1726853305.38982: variable 'item' from source: include params 13273 1726853305.38996: variable 'omit' from source: magic vars 13273 1726853305.39027: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853305.39056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853305.39074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853305.39088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.39097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.39121: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853305.39124: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.39126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.39196: Set connection var ansible_connection to ssh 13273 1726853305.39205: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853305.39210: Set connection var ansible_shell_executable to /bin/sh 13273 1726853305.39213: Set connection var ansible_shell_type to sh 13273 1726853305.39218: Set connection var ansible_pipelining to False 13273 1726853305.39223: Set connection var ansible_timeout to 10 13273 1726853305.39244: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.39248: variable 'ansible_connection' from source: unknown 13273 1726853305.39251: variable 'ansible_module_compression' from source: unknown 13273 1726853305.39254: variable 'ansible_shell_type' from source: unknown 13273 1726853305.39257: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.39259: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.39263: variable 'ansible_pipelining' from source: unknown 13273 1726853305.39266: variable 'ansible_timeout' 
from source: unknown 13273 1726853305.39269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.39367: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853305.39377: variable 'omit' from source: magic vars 13273 1726853305.39383: starting attempt loop 13273 1726853305.39386: running the handler 13273 1726853305.39462: variable 'lsr_net_profile_fingerprint' from source: set_fact 13273 1726853305.39465: Evaluated conditional (lsr_net_profile_fingerprint): True 13273 1726853305.39476: handler run complete 13273 1726853305.39490: attempt loop complete, returning result 13273 1726853305.39496: _execute() done 13273 1726853305.39499: dumping result to json 13273 1726853305.39502: done dumping result, returning 13273 1726853305.39504: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [02083763-bbaf-5fc3-657d-000000000366] 13273 1726853305.39513: sending task result for task 02083763-bbaf-5fc3-657d-000000000366 13273 1726853305.39584: done sending task result for task 02083763-bbaf-5fc3-657d-000000000366 13273 1726853305.39587: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853305.39663: no more pending results, returning what we have 13273 1726853305.39666: results queue empty 13273 1726853305.39667: checking for any_errors_fatal 13273 1726853305.39673: done checking for any_errors_fatal 13273 1726853305.39674: checking for max_fail_percentage 13273 1726853305.39675: done checking for max_fail_percentage 13273 1726853305.39676: checking to see if all hosts have failed and the running result is not ok 13273 1726853305.39677: done checking to see if all 
hosts have failed 13273 1726853305.39678: getting the remaining hosts for this loop 13273 1726853305.39679: done getting the remaining hosts for this loop 13273 1726853305.39682: getting the next task for host managed_node3 13273 1726853305.39687: done getting next task for host managed_node3 13273 1726853305.39690: ^ task is: TASK: ** TEST check polling interval 13273 1726853305.39692: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853305.39695: getting variables 13273 1726853305.39697: in VariableManager get_vars() 13273 1726853305.39743: Calling all_inventory to load vars for managed_node3 13273 1726853305.39746: Calling groups_inventory to load vars for managed_node3 13273 1726853305.39748: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853305.39756: Calling all_plugins_play to load vars for managed_node3 13273 1726853305.39758: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853305.39761: Calling groups_plugins_play to load vars for managed_node3 13273 1726853305.40512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853305.41465: done with get_vars() 13273 1726853305.41483: done getting variables 13273 1726853305.41522: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Friday 20 September 2024 13:28:25 -0400 (0:00:00.036) 0:00:23.304 ****** 13273 1726853305.41542: entering _queue_task() for managed_node3/command 13273 1726853305.41775: worker is 1 (out of 1 available) 13273 1726853305.41788: exiting _queue_task() for managed_node3/command 13273 1726853305.41800: done queuing things up, now waiting for results queue to drain 13273 1726853305.41801: waiting for pending results... 13273 1726853305.41978: running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval 13273 1726853305.42038: in run() - task 02083763-bbaf-5fc3-657d-000000000071 13273 1726853305.42052: variable 'ansible_search_path' from source: unknown 13273 1726853305.42083: calling self._execute() 13273 1726853305.42157: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.42161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.42170: variable 'omit' from source: magic vars 13273 1726853305.42438: variable 'ansible_distribution_major_version' from source: facts 13273 1726853305.42450: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853305.42456: variable 'omit' from source: magic vars 13273 1726853305.42476: variable 'omit' from source: magic vars 13273 1726853305.42541: variable 'controller_device' from source: play vars 13273 1726853305.42558: variable 'omit' from source: magic vars 13273 1726853305.42592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853305.42619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853305.42635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853305.42652: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.42661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.42688: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853305.42691: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.42694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.42760: Set connection var ansible_connection to ssh 13273 1726853305.42768: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853305.42775: Set connection var ansible_shell_executable to /bin/sh 13273 1726853305.42777: Set connection var ansible_shell_type to sh 13273 1726853305.42785: Set connection var ansible_pipelining to False 13273 1726853305.42790: Set connection var ansible_timeout to 10 13273 1726853305.42810: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.42813: variable 'ansible_connection' from source: unknown 13273 1726853305.42816: variable 'ansible_module_compression' from source: unknown 13273 1726853305.42818: variable 'ansible_shell_type' from source: unknown 13273 1726853305.42821: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.42823: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.42825: variable 'ansible_pipelining' from source: unknown 13273 1726853305.42828: variable 'ansible_timeout' from source: unknown 13273 1726853305.42832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.42936: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853305.42944: variable 'omit' from source: magic vars 13273 1726853305.42952: starting attempt loop 13273 1726853305.42955: running the handler 13273 1726853305.42968: _low_level_execute_command(): starting 13273 1726853305.42976: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853305.43495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853305.43500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853305.43503: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853305.43507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.43539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853305.43551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.43630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 13273 1726853305.45384: stdout chunk (state=3): >>>/root <<< 13273 1726853305.45532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853305.45535: stdout chunk (state=3): >>><<< 13273 1726853305.45537: stderr chunk (state=3): >>><<< 13273 1726853305.45570: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853305.45595: _low_level_execute_command(): starting 13273 1726853305.45683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004 `" && echo ansible-tmp-1726853305.455802-14410-222652350738004="` echo /root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004 `" ) && sleep 0' 13273 1726853305.46239: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.46258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.46313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.46361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853305.46364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853305.46366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.46421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853305.48604: stdout chunk (state=3): >>>ansible-tmp-1726853305.455802-14410-222652350738004=/root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004 <<< 13273 1726853305.48651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853305.48655: stdout chunk (state=3): >>><<< 13273 1726853305.48776: stderr chunk (state=3): >>><<< 13273 1726853305.48780: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853305.455802-14410-222652350738004=/root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853305.48783: variable 'ansible_module_compression' from source: unknown 13273 1726853305.48785: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853305.48799: variable 'ansible_facts' from source: unknown 13273 1726853305.48982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/AnsiballZ_command.py 13273 1726853305.49198: Sending initial data 13273 1726853305.49202: Sent initial data (155 bytes) 13273 1726853305.49702: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853305.49712: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.49723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853305.49791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.49878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853305.49887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853305.49890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.49944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853305.51641: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13273 1726853305.51647: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13273 1726853305.51649: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13273 1726853305.51652: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853305.51700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853305.51769: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpsuzlapss /root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/AnsiballZ_command.py <<< 13273 1726853305.51774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/AnsiballZ_command.py" <<< 13273 1726853305.51822: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpsuzlapss" to remote "/root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/AnsiballZ_command.py" <<< 13273 1726853305.52785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853305.52789: stdout chunk (state=3): >>><<< 13273 1726853305.52791: stderr chunk (state=3): >>><<< 13273 1726853305.52793: done transferring module to remote 13273 1726853305.52795: _low_level_execute_command(): starting 13273 1726853305.52798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/ /root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/AnsiballZ_command.py && sleep 0' 13273 1726853305.53359: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853305.53374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.53387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853305.53403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853305.53416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853305.53427: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853305.53447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.53536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853305.53566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.53667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853305.55522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853305.55582: stderr chunk (state=3): >>><<< 13273 1726853305.55586: stdout chunk (state=3): >>><<< 13273 1726853305.55595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853305.55598: _low_level_execute_command(): starting 13273 1726853305.55601: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/AnsiballZ_command.py && sleep 0' 13273 1726853305.56253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853305.56256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.56259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853305.56261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853305.56264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853305.56266: stderr chunk (state=3): >>>debug2: 
match not found <<< 13273 1726853305.56268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.56274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853305.56276: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853305.56279: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853305.56281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.56283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853305.56296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853305.56303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853305.56311: stderr chunk (state=3): >>>debug2: match found <<< 13273 1726853305.56320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.56391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853305.56439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853305.56442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.56505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853305.72760: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 13:28:25.722203", "end": "2024-09-20 13:28:25.726422", "delta": "0:00:00.004219", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' 
/proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853305.74578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853305.74582: stdout chunk (state=3): >>><<< 13273 1726853305.74584: stderr chunk (state=3): >>><<< 13273 1726853305.74587: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 13:28:25.722203", "end": "2024-09-20 13:28:25.726422", "delta": "0:00:00.004219", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853305.74589: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853305.74592: _low_level_execute_command(): starting 13273 1726853305.74594: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853305.455802-14410-222652350738004/ > /dev/null 2>&1 && sleep 0' 13273 1726853305.75530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.75538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853305.75558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 13273 1726853305.75561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.75577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853305.75591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853305.75669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853305.75793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.75928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853305.77864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853305.77881: stdout chunk (state=3): >>><<< 13273 1726853305.77890: stderr chunk (state=3): >>><<< 13273 1726853305.77907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853305.77924: handler run complete 13273 1726853305.77950: Evaluated conditional (False): False 13273 1726853305.78189: variable 'result' from source: unknown 13273 1726853305.78192: Evaluated conditional ('110' in result.stdout): True 13273 1726853305.78195: attempt loop complete, returning result 13273 1726853305.78197: _execute() done 13273 1726853305.78199: dumping result to json 13273 1726853305.78221: done dumping result, returning 13273 1726853305.78236: done running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval [02083763-bbaf-5fc3-657d-000000000071] 13273 1726853305.78276: sending task result for task 02083763-bbaf-5fc3-657d-000000000071 13273 1726853305.78536: done sending task result for task 02083763-bbaf-5fc3-657d-000000000071 13273 1726853305.78539: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.004219", "end": "2024-09-20 13:28:25.726422", "rc": 0, "start": "2024-09-20 13:28:25.722203" } STDOUT: MII Polling Interval (ms): 110 13273 1726853305.78637: no more pending results, returning what we have 13273 1726853305.78642: results queue empty 13273 1726853305.78643: checking for any_errors_fatal 13273 1726853305.78651: done checking for any_errors_fatal 13273 1726853305.78652: checking for max_fail_percentage 13273 1726853305.78654: done checking for max_fail_percentage 13273 
1726853305.78654: checking to see if all hosts have failed and the running result is not ok 13273 1726853305.78655: done checking to see if all hosts have failed 13273 1726853305.78656: getting the remaining hosts for this loop 13273 1726853305.78657: done getting the remaining hosts for this loop 13273 1726853305.78661: getting the next task for host managed_node3 13273 1726853305.78666: done getting next task for host managed_node3 13273 1726853305.78669: ^ task is: TASK: ** TEST check IPv4 13273 1726853305.78779: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853305.78790: getting variables 13273 1726853305.78792: in VariableManager get_vars() 13273 1726853305.78850: Calling all_inventory to load vars for managed_node3 13273 1726853305.78853: Calling groups_inventory to load vars for managed_node3 13273 1726853305.78856: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853305.78868: Calling all_plugins_play to load vars for managed_node3 13273 1726853305.79185: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853305.79191: Calling groups_plugins_play to load vars for managed_node3 13273 1726853305.81284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853305.82951: done with get_vars() 13273 1726853305.82977: done getting variables 13273 1726853305.83039: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] 
****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Friday 20 September 2024 13:28:25 -0400 (0:00:00.415) 0:00:23.719 ****** 13273 1726853305.83067: entering _queue_task() for managed_node3/command 13273 1726853305.83391: worker is 1 (out of 1 available) 13273 1726853305.83403: exiting _queue_task() for managed_node3/command 13273 1726853305.83415: done queuing things up, now waiting for results queue to drain 13273 1726853305.83416: waiting for pending results... 13273 1726853305.83878: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 13273 1726853305.83884: in run() - task 02083763-bbaf-5fc3-657d-000000000072 13273 1726853305.83886: variable 'ansible_search_path' from source: unknown 13273 1726853305.83890: calling self._execute() 13273 1726853305.83966: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.84081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.84085: variable 'omit' from source: magic vars 13273 1726853305.84533: variable 'ansible_distribution_major_version' from source: facts 13273 1726853305.84549: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853305.84560: variable 'omit' from source: magic vars 13273 1726853305.84586: variable 'omit' from source: magic vars 13273 1726853305.84732: variable 'controller_device' from source: play vars 13273 1726853305.84740: variable 'omit' from source: magic vars 13273 1726853305.84767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853305.84808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853305.84832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853305.84864: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.84882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853305.84949: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853305.84956: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.84959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.85037: Set connection var ansible_connection to ssh 13273 1726853305.85058: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853305.85077: Set connection var ansible_shell_executable to /bin/sh 13273 1726853305.85085: Set connection var ansible_shell_type to sh 13273 1726853305.85165: Set connection var ansible_pipelining to False 13273 1726853305.85170: Set connection var ansible_timeout to 10 13273 1726853305.85178: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.85180: variable 'ansible_connection' from source: unknown 13273 1726853305.85183: variable 'ansible_module_compression' from source: unknown 13273 1726853305.85185: variable 'ansible_shell_type' from source: unknown 13273 1726853305.85187: variable 'ansible_shell_executable' from source: unknown 13273 1726853305.85189: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853305.85192: variable 'ansible_pipelining' from source: unknown 13273 1726853305.85194: variable 'ansible_timeout' from source: unknown 13273 1726853305.85195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853305.85351: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853305.85367: variable 'omit' from source: magic vars 13273 1726853305.85379: starting attempt loop 13273 1726853305.85394: running the handler 13273 1726853305.85417: _low_level_execute_command(): starting 13273 1726853305.85476: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853305.86380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853305.86420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853305.86502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853305.86523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.86630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853305.88340: stdout chunk (state=3): >>>/root <<< 13273 1726853305.88495: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853305.88499: stdout chunk (state=3): >>><<< 13273 1726853305.88501: stderr chunk (state=3): >>><<< 13273 1726853305.88520: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853305.88618: _low_level_execute_command(): starting 13273 1726853305.88623: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323 `" && echo ansible-tmp-1726853305.885272-14430-186851081017323="` echo /root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323 `" ) && sleep 0' 13273 1726853305.89142: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853305.89156: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.89175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853305.89193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853305.89216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853305.89227: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853305.89285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.89342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853305.89364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853305.89378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.89474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853305.91557: stdout chunk (state=3): >>>ansible-tmp-1726853305.885272-14430-186851081017323=/root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323 <<< 13273 1726853305.91588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853305.91591: stdout chunk (state=3): >>><<< 13273 1726853305.91594: stderr chunk (state=3): >>><<< 13273 
1726853305.91782: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853305.885272-14430-186851081017323=/root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853305.91785: variable 'ansible_module_compression' from source: unknown 13273 1726853305.91976: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853305.91980: variable 'ansible_facts' from source: unknown 13273 1726853305.92110: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/AnsiballZ_command.py 13273 1726853305.92485: Sending initial data 13273 1726853305.92488: Sent initial data (155 bytes) 13273 1726853305.93588: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853305.93812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.93870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853305.95499: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13273 1726853305.95603: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853305.95651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853305.96006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpctu5yee8 /root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/AnsiballZ_command.py <<< 13273 1726853305.96016: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/AnsiballZ_command.py" <<< 13273 1726853305.96095: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpctu5yee8" to remote "/root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/AnsiballZ_command.py" <<< 13273 1726853305.97050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853305.97105: stderr chunk (state=3): >>><<< 13273 1726853305.97109: stdout chunk (state=3): >>><<< 13273 1726853305.97219: done transferring module to remote 13273 1726853305.97231: _low_level_execute_command(): starting 13273 1726853305.97241: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/ /root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/AnsiballZ_command.py && sleep 0' 13273 1726853305.97922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853305.97934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.97950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853305.98013: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.98066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853305.98081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853305.98186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853305.98218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853305.98274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.00195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.00198: stdout chunk (state=3): >>><<< 13273 1726853306.00201: stderr chunk (state=3): >>><<< 13273 1726853306.00216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853306.00276: _low_level_execute_command(): starting 13273 1726853306.00280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/AnsiballZ_command.py && sleep 0' 13273 1726853306.00827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853306.00845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853306.00861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853306.00888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853306.00900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.00917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.00957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.00975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.01053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.17059: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.225/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:26.165591", "end": "2024-09-20 13:28:26.169387", "delta": "0:00:00.003796", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853306.18852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853306.18856: stdout chunk (state=3): >>><<< 13273 1726853306.18859: stderr chunk (state=3): >>><<< 13273 1726853306.18861: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.225/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:26.165591", "end": "2024-09-20 13:28:26.169387", "delta": "0:00:00.003796", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853306.18864: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853306.18867: _low_level_execute_command(): starting 13273 1726853306.18877: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853305.885272-14430-186851081017323/ > /dev/null 2>&1 && sleep 0' 13273 1726853306.19444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853306.19460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.19493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.19535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.19542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853306.19557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.19622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.21689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.21693: stdout chunk (state=3): >>><<< 13273 1726853306.21695: stderr chunk (state=3): >>><<< 13273 1726853306.21698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853306.21700: handler run complete 13273 1726853306.21702: Evaluated conditional (False): False 13273 1726853306.21759: variable 'result' from source: set_fact 13273 1726853306.21786: Evaluated conditional ('192.0.2' in result.stdout): True 13273 1726853306.21798: attempt loop complete, returning result 13273 1726853306.21801: _execute() done 13273 1726853306.21804: dumping result to json 13273 1726853306.21809: done dumping result, returning 13273 1726853306.21817: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [02083763-bbaf-5fc3-657d-000000000072] 13273 1726853306.21822: sending task result for task 02083763-bbaf-5fc3-657d-000000000072 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003796", "end": "2024-09-20 13:28:26.169387", "rc": 0, "start": "2024-09-20 13:28:26.165591" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.225/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 235sec preferred_lft 235sec 13273 1726853306.22015: no more pending results, returning what we have 13273 1726853306.22019: results queue empty 13273 1726853306.22020: checking for any_errors_fatal 13273 1726853306.22030: done checking for any_errors_fatal 13273 1726853306.22030: checking for max_fail_percentage 13273 1726853306.22032: done checking for max_fail_percentage 13273 1726853306.22033: checking to see if all hosts have failed and the running result is not ok 13273 1726853306.22034: done checking to see if all hosts have failed 13273 1726853306.22035: getting the remaining hosts for this loop 13273 1726853306.22036: done getting the remaining hosts for this loop 13273 1726853306.22039: getting the next task for host managed_node3 13273 1726853306.22047: done getting next 
task for host managed_node3 13273 1726853306.22050: ^ task is: TASK: ** TEST check IPv6 13273 1726853306.22052: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853306.22056: getting variables 13273 1726853306.22058: in VariableManager get_vars() 13273 1726853306.22125: Calling all_inventory to load vars for managed_node3 13273 1726853306.22129: Calling groups_inventory to load vars for managed_node3 13273 1726853306.22132: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853306.22302: Calling all_plugins_play to load vars for managed_node3 13273 1726853306.22307: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853306.22312: Calling groups_plugins_play to load vars for managed_node3 13273 1726853306.22833: done sending task result for task 02083763-bbaf-5fc3-657d-000000000072 13273 1726853306.22838: WORKER PROCESS EXITING 13273 1726853306.23368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853306.24207: done with get_vars() 13273 1726853306.24223: done getting variables 13273 1726853306.24269: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Friday 20 September 2024 13:28:26 -0400 (0:00:00.412) 0:00:24.132 ****** 13273 1726853306.24293: 
entering _queue_task() for managed_node3/command 13273 1726853306.24529: worker is 1 (out of 1 available) 13273 1726853306.24543: exiting _queue_task() for managed_node3/command 13273 1726853306.24555: done queuing things up, now waiting for results queue to drain 13273 1726853306.24556: waiting for pending results... 13273 1726853306.24732: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 13273 1726853306.24799: in run() - task 02083763-bbaf-5fc3-657d-000000000073 13273 1726853306.24808: variable 'ansible_search_path' from source: unknown 13273 1726853306.24838: calling self._execute() 13273 1726853306.24919: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853306.24924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853306.24931: variable 'omit' from source: magic vars 13273 1726853306.25216: variable 'ansible_distribution_major_version' from source: facts 13273 1726853306.25230: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853306.25234: variable 'omit' from source: magic vars 13273 1726853306.25252: variable 'omit' from source: magic vars 13273 1726853306.25319: variable 'controller_device' from source: play vars 13273 1726853306.25333: variable 'omit' from source: magic vars 13273 1726853306.25369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853306.25399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853306.25415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853306.25430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853306.25439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 13273 1726853306.25467: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853306.25470: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853306.25474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853306.25536: Set connection var ansible_connection to ssh 13273 1726853306.25546: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853306.25554: Set connection var ansible_shell_executable to /bin/sh 13273 1726853306.25557: Set connection var ansible_shell_type to sh 13273 1726853306.25565: Set connection var ansible_pipelining to False 13273 1726853306.25575: Set connection var ansible_timeout to 10 13273 1726853306.25594: variable 'ansible_shell_executable' from source: unknown 13273 1726853306.25598: variable 'ansible_connection' from source: unknown 13273 1726853306.25601: variable 'ansible_module_compression' from source: unknown 13273 1726853306.25603: variable 'ansible_shell_type' from source: unknown 13273 1726853306.25606: variable 'ansible_shell_executable' from source: unknown 13273 1726853306.25608: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853306.25610: variable 'ansible_pipelining' from source: unknown 13273 1726853306.25612: variable 'ansible_timeout' from source: unknown 13273 1726853306.25616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853306.25721: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853306.25739: variable 'omit' from source: magic vars 13273 1726853306.25744: starting attempt loop 13273 1726853306.25750: running the handler 13273 1726853306.25764: 
_low_level_execute_command(): starting 13273 1726853306.25772: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853306.26264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853306.26293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.26299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853306.26301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.26355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.26359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.26430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.28129: stdout chunk (state=3): >>>/root <<< 13273 1726853306.28235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.28266: stderr chunk (state=3): >>><<< 13273 1726853306.28269: stdout chunk (state=3): >>><<< 13273 1726853306.28293: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853306.28304: _low_level_execute_command(): starting 13273 1726853306.28312: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452 `" && echo ansible-tmp-1726853306.2829187-14462-77351863295452="` echo /root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452 `" ) && sleep 0' 13273 1726853306.28740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853306.28747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853306.28787: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853306.28790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853306.28799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853306.28801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.28843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.28846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853306.28852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.28912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.30880: stdout chunk (state=3): >>>ansible-tmp-1726853306.2829187-14462-77351863295452=/root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452 <<< 13273 1726853306.30985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.31012: stderr chunk (state=3): >>><<< 13273 1726853306.31016: stdout chunk (state=3): >>><<< 13273 1726853306.31034: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853306.2829187-14462-77351863295452=/root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853306.31063: variable 'ansible_module_compression' from source: unknown 13273 1726853306.31109: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853306.31142: variable 'ansible_facts' from source: unknown 13273 1726853306.31201: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/AnsiballZ_command.py 13273 1726853306.31305: Sending initial data 13273 1726853306.31309: Sent initial data (155 bytes) 13273 1726853306.31736: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853306.31744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853306.31773: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.31777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853306.31780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.31833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.31840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853306.31842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.31902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.33528: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853306.33533: stderr chunk (state=3): >>>debug2: Unrecognised 
server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853306.33591: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853306.33652: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpm_srkyxt /root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/AnsiballZ_command.py <<< 13273 1726853306.33658: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/AnsiballZ_command.py" <<< 13273 1726853306.33712: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13273 1726853306.33716: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpm_srkyxt" to remote "/root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/AnsiballZ_command.py" <<< 13273 1726853306.34338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.34383: stderr chunk (state=3): >>><<< 13273 1726853306.34386: stdout chunk (state=3): >>><<< 13273 1726853306.34425: done transferring module to remote 13273 1726853306.34434: _low_level_execute_command(): starting 13273 1726853306.34439: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/ /root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/AnsiballZ_command.py && sleep 0' 13273 1726853306.34870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853306.34907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853306.34910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853306.34914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.34919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853306.34925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853306.34927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.34975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.34981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853306.34983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.35049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.36862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.36886: stderr chunk (state=3): >>><<< 13273 1726853306.36889: stdout chunk (state=3): >>><<< 13273 1726853306.36903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853306.36906: _low_level_execute_command(): starting 13273 1726853306.36912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/AnsiballZ_command.py && sleep 0' 13273 1726853306.37348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853306.37352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.37354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853306.37356: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853306.37358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.37398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.37402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.37476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.53315: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::86/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::380b:9dff:fef3:469/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::380b:9dff:fef3:469/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:26.527963", "end": "2024-09-20 13:28:26.531863", "delta": "0:00:00.003900", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853306.55104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853306.55108: stdout chunk (state=3): >>><<< 13273 1726853306.55110: stderr chunk (state=3): >>><<< 13273 1726853306.55113: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::86/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::380b:9dff:fef3:469/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::380b:9dff:fef3:469/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:28:26.527963", "end": "2024-09-20 13:28:26.531863", "delta": "0:00:00.003900", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853306.55133: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853306.55141: _low_level_execute_command(): starting 13273 1726853306.55149: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853306.2829187-14462-77351863295452/ > /dev/null 2>&1 && sleep 0' 13273 1726853306.56063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.56068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.56073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853306.56076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.56127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.58323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.58327: stdout chunk (state=3): >>><<< 13273 1726853306.58329: stderr chunk (state=3): >>><<< 13273 1726853306.58332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853306.58334: handler run complete
13273 1726853306.58337: Evaluated conditional (False): False
13273 1726853306.58512: variable 'result' from source: set_fact
13273 1726853306.58548: Evaluated conditional ('2001' in result.stdout): True
13273 1726853306.58560: attempt loop complete, returning result
13273 1726853306.58563: _execute() done
13273 1726853306.58566: dumping result to json
13273 1726853306.58575: done dumping result, returning
13273 1726853306.58582: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [02083763-bbaf-5fc3-657d-000000000073]
13273 1726853306.58587: sending task result for task 02083763-bbaf-5fc3-657d-000000000073
13273 1726853306.58826: done sending task result for task 02083763-bbaf-5fc3-657d-000000000073
13273 1726853306.58829: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "attempts": 1,
    "changed": false,
    "cmd": [
        "ip",
        "-6",
        "a",
        "s",
        "nm-bond"
    ],
    "delta": "0:00:00.003900",
    "end": "2024-09-20 13:28:26.531863",
    "rc": 0,
    "start": "2024-09-20 13:28:26.527963"
}

STDOUT:

24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000
    inet6 2001:db8::86/128 scope global dynamic noprefixroute
       valid_lft 236sec preferred_lft 236sec
    inet6 2001:db8::380b:9dff:fef3:469/64 scope global dynamic noprefixroute
       valid_lft 1798sec preferred_lft 1798sec
    inet6 fe80::380b:9dff:fef3:469/64 scope link noprefixroute
       valid_lft forever preferred_lft forever
13273 1726853306.58906: no more pending results, returning what we have
13273 1726853306.58909: results queue empty
13273 1726853306.58910: checking for any_errors_fatal
13273 1726853306.58917: done checking for any_errors_fatal
13273 1726853306.58917: checking for max_fail_percentage
13273 1726853306.58919: done checking for max_fail_percentage
13273 1726853306.58919: checking to see if all hosts have failed and the running result is not ok
13273 1726853306.58920: done checking to see if all hosts have failed
13273 1726853306.58920: getting the remaining hosts for this loop
13273 1726853306.58922: done getting the remaining hosts for this loop
13273 1726853306.58924: getting the next task for host managed_node3
13273 1726853306.58931: done getting next task for host managed_node3
13273 1726853306.58934: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
13273 1726853306.58937: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853306.58964: getting variables
13273 1726853306.58965: in VariableManager get_vars()
13273 1726853306.59012: Calling all_inventory to load vars for managed_node3
13273 1726853306.59015: Calling groups_inventory to load vars for managed_node3
13273 1726853306.59017: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853306.59027: Calling all_plugins_play to load vars for managed_node3
13273 1726853306.59030: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853306.59033: Calling groups_plugins_play to load vars for managed_node3
13273 1726853306.60417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853306.62066: done with get_vars()
13273 1726853306.62103: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:28:26 -0400 (0:00:00.379) 0:00:24.511 ******
13273 1726853306.62206: entering _queue_task() for managed_node3/include_tasks
13273 1726853306.62582: worker is 1 (out of 1 available)
13273 1726853306.62593: exiting _queue_task() for managed_node3/include_tasks
13273 1726853306.62605: done queuing things up, now waiting for results queue to drain
13273 1726853306.62606: waiting for pending results...
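[Editor's note] The "** TEST check IPv6" result above shows the exact command run (`ip -6 a s nm-bond`) and the retry condition evaluated by the task executor (`'2001' in result.stdout`, with `"attempts": 1`). The playbook itself is not part of this log, but based on those values the task plausibly looks like the following hypothetical reconstruction (the name, retry count, and delay are guesses):

```yaml
# Hypothetical reconstruction of the "** TEST check IPv6" task seen in the
# log above. Only cmd, register/until, and changed state are taken from the
# log; retries/delay are illustrative guesses.
- name: "** TEST check IPv6"
  command: ip -6 a s nm-bond
  register: result
  until: "'2001' in result.stdout"   # matches "Evaluated conditional ('2001' in result.stdout): True"
  retries: 3                         # guess; the log shows it passed on attempt 1
  delay: 2                           # guess
  changed_when: false                # consistent with "changed": false in the result
```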
13273 1726853306.62992: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
13273 1726853306.63066: in run() - task 02083763-bbaf-5fc3-657d-00000000007b
13273 1726853306.63096: variable 'ansible_search_path' from source: unknown
13273 1726853306.63103: variable 'ansible_search_path' from source: unknown
13273 1726853306.63152: calling self._execute()
13273 1726853306.63256: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853306.63270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853306.63288: variable 'omit' from source: magic vars
13273 1726853306.63704: variable 'ansible_distribution_major_version' from source: facts
13273 1726853306.63722: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853306.63740: _execute() done
13273 1726853306.63752: dumping result to json
13273 1726853306.63760: done dumping result, returning
13273 1726853306.63772: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-5fc3-657d-00000000007b]
13273 1726853306.63784: sending task result for task 02083763-bbaf-5fc3-657d-00000000007b
13273 1726853306.64048: no more pending results, returning what we have
13273 1726853306.64053: in VariableManager get_vars()
13273 1726853306.64123: Calling all_inventory to load vars for managed_node3
13273 1726853306.64127: Calling groups_inventory to load vars for managed_node3
13273 1726853306.64129: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853306.64146: Calling all_plugins_play to load vars for managed_node3
13273 1726853306.64149: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853306.64152: Calling groups_plugins_play to load vars for managed_node3
13273 1726853306.64689: done sending task result for task 02083763-bbaf-5fc3-657d-00000000007b
13273 1726853306.64693: WORKER PROCESS EXITING
13273 1726853306.65928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853306.67363: done with get_vars()
13273 1726853306.67382: variable 'ansible_search_path' from source: unknown
13273 1726853306.67383: variable 'ansible_search_path' from source: unknown
13273 1726853306.67414: we have included files to process
13273 1726853306.67415: generating all_blocks data
13273 1726853306.67416: done generating all_blocks data
13273 1726853306.67420: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13273 1726853306.67421: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13273 1726853306.67422: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13273 1726853306.67807: done processing included file
13273 1726853306.67808: iterating over new_blocks loaded from include file
13273 1726853306.67809: in VariableManager get_vars()
13273 1726853306.67831: done with get_vars()
13273 1726853306.67832: filtering new block on tags
13273 1726853306.67847: done filtering new block on tags
13273 1726853306.67849: in VariableManager get_vars()
13273 1726853306.67867: done with get_vars()
13273 1726853306.67869: filtering new block on tags
13273 1726853306.67883: done filtering new block on tags
13273 1726853306.67884: in VariableManager get_vars()
13273 1726853306.67901: done with get_vars()
13273 1726853306.67902: filtering new block on tags
13273 1726853306.67911: done filtering new block on tags
13273 1726853306.67913: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3
13273 1726853306.67917: extending task lists for all hosts with included blocks
13273 1726853306.68362: done extending task lists
13273 1726853306.68363: done processing included files
13273 1726853306.68363: results queue empty
13273 1726853306.68364: checking for any_errors_fatal
13273 1726853306.68367: done checking for any_errors_fatal
13273 1726853306.68367: checking for max_fail_percentage
13273 1726853306.68368: done checking for max_fail_percentage
13273 1726853306.68368: checking to see if all hosts have failed and the running result is not ok
13273 1726853306.68369: done checking to see if all hosts have failed
13273 1726853306.68369: getting the remaining hosts for this loop
13273 1726853306.68370: done getting the remaining hosts for this loop
13273 1726853306.68373: getting the next task for host managed_node3
13273 1726853306.68376: done getting next task for host managed_node3
13273 1726853306.68378: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
13273 1726853306.68381: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853306.68389: getting variables
13273 1726853306.68390: in VariableManager get_vars()
13273 1726853306.68403: Calling all_inventory to load vars for managed_node3
13273 1726853306.68405: Calling groups_inventory to load vars for managed_node3
13273 1726853306.68406: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853306.68410: Calling all_plugins_play to load vars for managed_node3
13273 1726853306.68411: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853306.68413: Calling groups_plugins_play to load vars for managed_node3
13273 1726853306.69395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853306.70823: done with get_vars()
13273 1726853306.70838: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 13:28:26 -0400 (0:00:00.086) 0:00:24.598 ******
13273 1726853306.70895: entering _queue_task() for managed_node3/setup
13273 1726853306.71152: worker is 1 (out of 1 available)
13273 1726853306.71165: exiting _queue_task() for managed_node3/setup
13273 1726853306.71181: done queuing things up, now waiting for results queue to drain
13273 1726853306.71183: waiting for pending results...
13273 1726853306.71365: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
13273 1726853306.71474: in run() - task 02083763-bbaf-5fc3-657d-0000000006c5
13273 1726853306.71486: variable 'ansible_search_path' from source: unknown
13273 1726853306.71490: variable 'ansible_search_path' from source: unknown
13273 1726853306.71522: calling self._execute()
13273 1726853306.71594: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853306.71599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853306.71607: variable 'omit' from source: magic vars
13273 1726853306.71891: variable 'ansible_distribution_major_version' from source: facts
13273 1726853306.71900: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853306.72044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13273 1726853306.73926: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13273 1726853306.73982: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13273 1726853306.74013: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13273 1726853306.74039: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13273 1726853306.74060: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13273 1726853306.74121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853306.74140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853306.74158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853306.74185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853306.74196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853306.74237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13273 1726853306.74254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13273 1726853306.74270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853306.74296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13273 1726853306.74306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13273 1726853306.74414: variable '__network_required_facts' from source: role '' defaults
13273 1726853306.74428: variable 'ansible_facts' from source: unknown
13273 1726853306.74864: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
13273 1726853306.74868: when evaluation is False, skipping this task
13273 1726853306.74872: _execute() done
13273 1726853306.74875: dumping result to json
13273 1726853306.74877: done dumping result, returning
13273 1726853306.74882: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-5fc3-657d-0000000006c5]
13273 1726853306.74887: sending task result for task 02083763-bbaf-5fc3-657d-0000000006c5
13273 1726853306.74977: done sending task result for task 02083763-bbaf-5fc3-657d-0000000006c5
13273 1726853306.74980: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13273 1726853306.75024: no more pending results, returning what we have
13273 1726853306.75027: results queue empty
13273 1726853306.75028: checking for any_errors_fatal
13273 1726853306.75029: done checking for any_errors_fatal
13273 1726853306.75030: checking for max_fail_percentage
13273 1726853306.75031: done checking for max_fail_percentage
13273 1726853306.75032: checking to see if all hosts have failed and the running result is not ok
13273 1726853306.75032: done checking to see if all hosts have failed
13273 1726853306.75033: getting the remaining hosts for this loop
13273 1726853306.75034: done getting the remaining hosts for this loop
13273 1726853306.75038: getting the next task for host managed_node3
13273 1726853306.75048: done getting next task for host managed_node3
13273 1726853306.75051: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
13273 1726853306.75055: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853306.75075: getting variables
13273 1726853306.75077: in VariableManager get_vars()
13273 1726853306.75134: Calling all_inventory to load vars for managed_node3
13273 1726853306.75137: Calling groups_inventory to load vars for managed_node3
13273 1726853306.75140: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853306.75151: Calling all_plugins_play to load vars for managed_node3
13273 1726853306.75153: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853306.75156: Calling groups_plugins_play to load vars for managed_node3
13273 1726853306.75986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853306.76977: done with get_vars()
13273 1726853306.77002: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 13:28:26 -0400 (0:00:00.062) 0:00:24.660 ******
13273 1726853306.77115: entering _queue_task() for managed_node3/stat
13273 1726853306.77455: worker is 1 (out of 1 available)
13273 1726853306.77468: exiting _queue_task() for managed_node3/stat
13273 1726853306.77483: done queuing things up, now waiting for results queue to drain
13273 1726853306.77484: waiting for pending results...
13273 1726853306.77791: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
13273 1726853306.77936: in run() - task 02083763-bbaf-5fc3-657d-0000000006c7
13273 1726853306.77957: variable 'ansible_search_path' from source: unknown
13273 1726853306.77960: variable 'ansible_search_path' from source: unknown
13273 1726853306.77986: calling self._execute()
13273 1726853306.78059: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853306.78063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853306.78074: variable 'omit' from source: magic vars
13273 1726853306.78351: variable 'ansible_distribution_major_version' from source: facts
13273 1726853306.78360: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853306.78474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13273 1726853306.78667: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13273 1726853306.78701: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13273 1726853306.78727: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13273 1726853306.78752: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13273 1726853306.78824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13273 1726853306.78845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13273 1726853306.78864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853306.78883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13273 1726853306.78949: variable '__network_is_ostree' from source: set_fact
13273 1726853306.78952: Evaluated conditional (not __network_is_ostree is defined): False
13273 1726853306.78955: when evaluation is False, skipping this task
13273 1726853306.78957: _execute() done
13273 1726853306.78959: dumping result to json
13273 1726853306.78963: done dumping result, returning
13273 1726853306.78973: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-5fc3-657d-0000000006c7]
13273 1726853306.78976: sending task result for task 02083763-bbaf-5fc3-657d-0000000006c7
13273 1726853306.79056: done sending task result for task 02083763-bbaf-5fc3-657d-0000000006c7
13273 1726853306.79059: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
13273 1726853306.79123: no more pending results, returning what we have
13273 1726853306.79126: results queue empty
13273 1726853306.79127: checking for any_errors_fatal
13273 1726853306.79134: done checking for any_errors_fatal
13273 1726853306.79135: checking for max_fail_percentage
13273 1726853306.79137: done checking for max_fail_percentage
13273 1726853306.79137: checking to see if all hosts have failed and the running result is not ok
13273 1726853306.79138: done checking to see if all hosts have failed
13273 1726853306.79139: getting the remaining hosts for this loop
13273 1726853306.79140: done getting the remaining hosts for this loop
13273 1726853306.79146: getting the next task for host managed_node3
13273 1726853306.79151: done getting next task for host managed_node3
13273 1726853306.79154: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
13273 1726853306.79158: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853306.79177: getting variables
13273 1726853306.79179: in VariableManager get_vars()
13273 1726853306.79222: Calling all_inventory to load vars for managed_node3
13273 1726853306.79225: Calling groups_inventory to load vars for managed_node3
13273 1726853306.79227: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853306.79235: Calling all_plugins_play to load vars for managed_node3
13273 1726853306.79237: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853306.79239: Calling groups_plugins_play to load vars for managed_node3
13273 1726853306.80559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853306.82047: done with get_vars()
13273 1726853306.82066: done getting variables
13273 1726853306.82112: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 13:28:26 -0400 (0:00:00.050) 0:00:24.710 ******
13273 1726853306.82137: entering _queue_task() for managed_node3/set_fact
13273 1726853306.82401: worker is 1 (out of 1 available)
13273 1726853306.82415: exiting _queue_task() for managed_node3/set_fact
13273 1726853306.82427: done queuing things up, now waiting for results queue to drain
13273 1726853306.82428: waiting for pending results...
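[Editor's note] The tasks at set_facts.yml:3, :12, and :17 are skipped above because their `when` conditionals evaluate to False. The role file itself is not reproduced in this log; based only on the conditionals and task paths the log records, the relevant tasks plausibly look like this hypothetical sketch (module arguments such as the stat path and gather subset are guesses, not taken from the log):

```yaml
# Hypothetical sketch of roles/network/tasks/set_facts.yml, reconstructed
# from the conditionals shown in the log; argument values are guesses.
- name: Ensure ansible_facts used by role are present   # set_facts.yml:3
  setup:
    gather_subset: min                                  # guess
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

- name: Check if system is ostree                       # set_facts.yml:12
  stat:
    path: /run/ostree-booted                            # guess
  register: __ostree_booted_stat                        # guessed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree           # set_facts.yml:17
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```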
13273 1726853306.82610: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
13273 1726853306.82716: in run() - task 02083763-bbaf-5fc3-657d-0000000006c8
13273 1726853306.82729: variable 'ansible_search_path' from source: unknown
13273 1726853306.82732: variable 'ansible_search_path' from source: unknown
13273 1726853306.82762: calling self._execute()
13273 1726853306.82830: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853306.82834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853306.82846: variable 'omit' from source: magic vars
13273 1726853306.83126: variable 'ansible_distribution_major_version' from source: facts
13273 1726853306.83136: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853306.83277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13273 1726853306.83447: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13273 1726853306.83479: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13273 1726853306.83504: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13273 1726853306.83529: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13273 1726853306.83594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13273 1726853306.83611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13273 1726853306.83630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13273 1726853306.83652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13273 1726853306.83716: variable '__network_is_ostree' from source: set_fact
13273 1726853306.83722: Evaluated conditional (not __network_is_ostree is defined): False
13273 1726853306.83725: when evaluation is False, skipping this task
13273 1726853306.83728: _execute() done
13273 1726853306.83730: dumping result to json
13273 1726853306.83734: done dumping result, returning
13273 1726853306.83741: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-5fc3-657d-0000000006c8]
13273 1726853306.83746: sending task result for task 02083763-bbaf-5fc3-657d-0000000006c8
13273 1726853306.83857: done sending task result for task 02083763-bbaf-5fc3-657d-0000000006c8
13273 1726853306.83860: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
13273 1726853306.84020: no more pending results, returning what we have
13273 1726853306.84024: results queue empty
13273 1726853306.84025: checking for any_errors_fatal
13273 1726853306.84030: done checking for any_errors_fatal
13273 1726853306.84031: checking for max_fail_percentage
13273 1726853306.84032: done checking for max_fail_percentage
13273 1726853306.84033: checking to see if all hosts have failed and the running result is not ok
13273 1726853306.84034: done checking to see if all hosts have failed
13273 1726853306.84034: getting the remaining hosts for this loop
13273 1726853306.84036: done getting the remaining hosts for this loop
13273 1726853306.84039: getting the next task for host managed_node3
13273 1726853306.84050: done getting next task for host managed_node3
13273 1726853306.84054: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
13273 1726853306.84059: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853306.84081: getting variables
13273 1726853306.84084: in VariableManager get_vars()
13273 1726853306.84138: Calling all_inventory to load vars for managed_node3
13273 1726853306.84141: Calling groups_inventory to load vars for managed_node3
13273 1726853306.84147: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853306.84156: Calling all_plugins_play to load vars for managed_node3
13273 1726853306.84160: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853306.84163: Calling groups_plugins_play to load vars for managed_node3
13273 1726853306.85352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853306.86216: done with get_vars()
13273 1726853306.86235: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 13:28:26 -0400 (0:00:00.041) 0:00:24.752 ******
13273 1726853306.86310: entering _queue_task() for managed_node3/service_facts
13273 1726853306.86562: worker is 1 (out of 1 available)
13273 1726853306.86577: exiting _queue_task() for managed_node3/service_facts
13273 1726853306.86590: done queuing things up, now waiting for results queue to drain
13273 1726853306.86591: waiting for pending results...
13273 1726853306.86770: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running
13273 1726853306.86859: in run() - task 02083763-bbaf-5fc3-657d-0000000006ca
13273 1726853306.86873: variable 'ansible_search_path' from source: unknown
13273 1726853306.86877: variable 'ansible_search_path' from source: unknown
13273 1726853306.86904: calling self._execute()
13273 1726853306.86994: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853306.86998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853306.87007: variable 'omit' from source: magic vars
13273 1726853306.87507: variable 'ansible_distribution_major_version' from source: facts
13273 1726853306.87511: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853306.87513: variable 'omit' from source: magic vars
13273 1726853306.87526: variable 'omit' from source: magic vars
13273 1726853306.87569: variable 'omit' from source: magic vars
13273 1726853306.87621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13273 1726853306.87666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13273 1726853306.87696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13273 1726853306.87722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853306.87742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853306.87834: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13273 1726853306.87840: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853306.87845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853306.87903: Set connection var ansible_connection to ssh
13273 1726853306.87918: Set connection var ansible_module_compression to ZIP_DEFLATED
13273 1726853306.87926: Set connection var ansible_shell_executable to /bin/sh
13273 1726853306.87932: Set connection var ansible_shell_type to sh
13273 1726853306.87953: Set connection var ansible_pipelining to False
13273 1726853306.87962: Set connection var ansible_timeout to 10
13273 1726853306.87995: variable 'ansible_shell_executable' from source: unknown
13273 1726853306.88002: variable 'ansible_connection' from source: unknown
13273 1726853306.88051: variable 'ansible_module_compression' from source: unknown
13273 1726853306.88059: variable 'ansible_shell_type' from source: unknown
13273 1726853306.88062: variable 'ansible_shell_executable' from source: unknown
13273 1726853306.88064: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853306.88066: variable 'ansible_pipelining' from source: unknown
13273 1726853306.88068: variable 'ansible_timeout' from source: unknown
13273 1726853306.88070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853306.88273: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
13273 1726853306.88297: variable 'omit' from source: magic vars
13273 1726853306.88300: starting attempt loop
13273 1726853306.88303: running the handler
13273 1726853306.88315: _low_level_execute_command(): starting
13273 1726853306.88321: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13273 1726853306.88839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853306.88843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853306.88847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853306.88894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853306.88898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853306.88973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853306.90701: stdout chunk (state=3): >>>/root <<<
13273 1726853306.90797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853306.90836: stderr chunk (state=3): >>><<<
13273 1726853306.90838: stdout chunk (state=3): >>><<<
13273 1726853306.90851: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217
debug2: match not found
debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853306.90879: _low_level_execute_command(): starting 13273 1726853306.90882: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152 `" && echo ansible-tmp-1726853306.908554-14493-194654975557152="` echo /root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152 `" ) && sleep 0' 13273 1726853306.91435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.91467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853306.91484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.91677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.93616: stdout chunk (state=3): >>>ansible-tmp-1726853306.908554-14493-194654975557152=/root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152 <<< 13273 1726853306.93740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.93747: stdout chunk (state=3): >>><<< 13273 1726853306.93750: stderr chunk (state=3): >>><<< 13273 1726853306.93765: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853306.908554-14493-194654975557152=/root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853306.93809: variable 'ansible_module_compression' from source: unknown 13273 1726853306.93848: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13273 1726853306.93883: variable 'ansible_facts' from source: unknown 13273 1726853306.93941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/AnsiballZ_service_facts.py 13273 1726853306.94043: Sending initial data 13273 1726853306.94047: Sent initial data (161 bytes) 13273 1726853306.95402: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853306.95436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/bee039678b' <<< 13273 1726853306.95454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853306.95486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853306.95597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853306.97422: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853306.97478: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853306.97542: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpv1wd3yrj /root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/AnsiballZ_service_facts.py <<< 13273 1726853306.97546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/AnsiballZ_service_facts.py" <<< 13273 1726853306.97601: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13273 1726853306.97613: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpv1wd3yrj" to remote "/root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/AnsiballZ_service_facts.py" <<< 13273 1726853306.99183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853306.99187: stdout chunk (state=3): >>><<< 13273 1726853306.99190: stderr chunk (state=3): >>><<< 13273 1726853306.99195: done transferring module to remote 13273 1726853306.99213: _low_level_execute_command(): starting 13273 1726853306.99287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/ /root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/AnsiballZ_service_facts.py && sleep 0' 13273 1726853307.00630: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853307.00649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853307.00657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853307.00675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853307.00687: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853307.00867: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853307.00885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853307.00895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853307.01102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853307.03049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853307.03053: stdout chunk (state=3): >>><<< 13273 1726853307.03058: stderr chunk (state=3): >>><<< 13273 1726853307.03075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853307.03078: _low_level_execute_command(): starting 13273 1726853307.03085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/AnsiballZ_service_facts.py && sleep 0' 13273 1726853307.03659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853307.03668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853307.03680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853307.03694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853307.03706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853307.03752: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853307.03756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853307.03758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853307.03760: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853307.03763: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853307.03765: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853307.03767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853307.03769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853307.03784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853307.03791: stderr chunk (state=3): >>>debug2: match found <<< 13273 1726853307.03801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853307.03921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853307.03924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853307.03926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853307.03989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853308.64048: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 13273 1726853308.64068: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 13273 1726853308.64102: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": 
{"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13273 1726853308.65698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
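The payload above is the `service_facts` module's return value: `ansible_facts.services` maps each unit name to a dict with `name`, `state` (`running`/`stopped`/`inactive`/`unknown`), `status` (`enabled`/`disabled`/`static`/`alias`/`not-found`/…), and `source`. A minimal sketch of filtering it outside Ansible — the sample payload and the `running_services` helper are hypothetical, chosen only to mirror the shape of the JSON shown here:

```python
import json

# Hypothetical sample mirroring the shape of the service_facts payload above:
# ansible_facts.services maps unit names to {name, state, status, source}.
raw = """
{"ansible_facts": {"services": {
  "sshd.service": {"name": "sshd.service", "state": "running",
                   "status": "enabled", "source": "systemd"},
  "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                        "status": "disabled", "source": "systemd"},
  "auditd.service": {"name": "auditd.service", "state": "running",
                     "status": "enabled", "source": "systemd"}
}}}
"""

def running_services(payload: dict) -> list[str]:
    """Return the sorted names of services reported as running."""
    services = payload["ansible_facts"]["services"]
    return sorted(name for name, info in services.items()
                  if info["state"] == "running")

facts = json.loads(raw)
print(running_services(facts))  # -> ['auditd.service', 'sshd.service']
```

Inside a playbook the same filter would typically be a Jinja2 `selectattr` over `ansible_facts.services` rather than external Python.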
<<< 13273 1726853308.65734: stderr chunk (state=3): >>><<< 13273 1726853308.65752: stdout chunk (state=3): >>><<< 13273 1726853308.65978: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": 
{"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853308.66877: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853308.66892: _low_level_execute_command(): starting 13273 1726853308.66901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853306.908554-14493-194654975557152/ > /dev/null 2>&1 && sleep 0' 13273 1726853308.67541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853308.67554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853308.67570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853308.67591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853308.67608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853308.67620: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853308.67641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853308.67686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853308.67751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853308.67769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853308.67793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853308.67887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853308.69951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853308.69955: stdout chunk (state=3): >>><<< 13273 1726853308.69957: stderr chunk (state=3): >>><<< 13273 1726853308.69960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853308.69962: handler run complete 13273 1726853308.70093: variable 'ansible_facts' from source: unknown 13273 1726853308.70240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853308.70751: variable 'ansible_facts' from source: unknown 13273 1726853308.70892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853308.71115: attempt loop complete, returning result 13273 1726853308.71125: _execute() done 13273 1726853308.71132: dumping result to json 13273 1726853308.71203: done dumping result, returning 13273 1726853308.71215: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-5fc3-657d-0000000006ca] 13273 1726853308.71223: sending task result for task 02083763-bbaf-5fc3-657d-0000000006ca 13273 1726853308.72477: done sending task result for task 02083763-bbaf-5fc3-657d-0000000006ca 13273 1726853308.72480: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853308.72603: no more pending results, returning what we have 13273 1726853308.72605: results queue empty 13273 1726853308.72606: checking for any_errors_fatal 13273 1726853308.72609: done checking for any_errors_fatal 13273 1726853308.72610: checking for max_fail_percentage 13273 1726853308.72612: done checking for max_fail_percentage 13273 1726853308.72612: checking to see if all hosts have failed and the running result is not ok 13273 1726853308.72613: done checking to see if all hosts have failed 13273 
1726853308.72615: getting the remaining hosts for this loop 13273 1726853308.72617: done getting the remaining hosts for this loop 13273 1726853308.72620: getting the next task for host managed_node3 13273 1726853308.72625: done getting next task for host managed_node3 13273 1726853308.72629: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13273 1726853308.72633: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853308.72643: getting variables 13273 1726853308.72644: in VariableManager get_vars() 13273 1726853308.72723: Calling all_inventory to load vars for managed_node3 13273 1726853308.72725: Calling groups_inventory to load vars for managed_node3 13273 1726853308.72728: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853308.72737: Calling all_plugins_play to load vars for managed_node3 13273 1726853308.72740: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853308.72743: Calling groups_plugins_play to load vars for managed_node3 13273 1726853308.74047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853308.75663: done with get_vars() 13273 1726853308.75687: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:28:28 -0400 (0:00:01.894) 0:00:26.647 ****** 13273 1726853308.75790: entering _queue_task() for managed_node3/package_facts 13273 1726853308.76104: worker is 1 (out of 1 available) 13273 1726853308.76116: exiting _queue_task() for managed_node3/package_facts 13273 1726853308.76128: done queuing things up, now waiting for results queue to drain 13273 1726853308.76129: waiting for pending results... 
13273 1726853308.76499: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13273 1726853308.76570: in run() - task 02083763-bbaf-5fc3-657d-0000000006cb 13273 1726853308.76598: variable 'ansible_search_path' from source: unknown 13273 1726853308.76606: variable 'ansible_search_path' from source: unknown 13273 1726853308.76647: calling self._execute() 13273 1726853308.76748: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853308.76761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853308.76778: variable 'omit' from source: magic vars 13273 1726853308.77174: variable 'ansible_distribution_major_version' from source: facts 13273 1726853308.77193: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853308.77205: variable 'omit' from source: magic vars 13273 1726853308.77284: variable 'omit' from source: magic vars 13273 1726853308.77324: variable 'omit' from source: magic vars 13273 1726853308.77376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853308.77468: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853308.77473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853308.77476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853308.77478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853308.77513: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853308.77522: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853308.77530: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13273 1726853308.77637: Set connection var ansible_connection to ssh 13273 1726853308.77654: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853308.77665: Set connection var ansible_shell_executable to /bin/sh 13273 1726853308.77675: Set connection var ansible_shell_type to sh 13273 1726853308.77692: Set connection var ansible_pipelining to False 13273 1726853308.77776: Set connection var ansible_timeout to 10 13273 1726853308.77779: variable 'ansible_shell_executable' from source: unknown 13273 1726853308.77781: variable 'ansible_connection' from source: unknown 13273 1726853308.77785: variable 'ansible_module_compression' from source: unknown 13273 1726853308.77792: variable 'ansible_shell_type' from source: unknown 13273 1726853308.77795: variable 'ansible_shell_executable' from source: unknown 13273 1726853308.77797: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853308.77800: variable 'ansible_pipelining' from source: unknown 13273 1726853308.77802: variable 'ansible_timeout' from source: unknown 13273 1726853308.77804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853308.77985: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853308.78011: variable 'omit' from source: magic vars 13273 1726853308.78014: starting attempt loop 13273 1726853308.78121: running the handler 13273 1726853308.78124: _low_level_execute_command(): starting 13273 1726853308.78127: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853308.78899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853308.78921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853308.78938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853308.79037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853308.80761: stdout chunk (state=3): >>>/root <<< 13273 1726853308.80907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853308.80919: stdout chunk (state=3): >>><<< 13273 1726853308.80935: stderr chunk (state=3): >>><<< 13273 1726853308.80966: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853308.80993: _low_level_execute_command(): starting 13273 1726853308.81005: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562 `" && echo ansible-tmp-1726853308.809776-14571-226488648880562="` echo /root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562 `" ) && sleep 0' 13273 1726853308.81636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853308.81656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853308.81674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853308.81694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853308.81713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853308.81738: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853308.81799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853308.81875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853308.81895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853308.81918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853308.82019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853308.84034: stdout chunk (state=3): >>>ansible-tmp-1726853308.809776-14571-226488648880562=/root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562 <<< 13273 1726853308.84202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853308.84205: stdout chunk (state=3): >>><<< 13273 1726853308.84208: stderr chunk (state=3): >>><<< 13273 1726853308.84227: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853308.809776-14571-226488648880562=/root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853308.84309: variable 'ansible_module_compression' from source: unknown 13273 1726853308.84347: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13273 1726853308.84431: variable 'ansible_facts' from source: unknown 13273 1726853308.84676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/AnsiballZ_package_facts.py 13273 1726853308.84912: Sending initial data 13273 1726853308.84915: Sent initial data (161 bytes) 13273 1726853308.85470: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853308.85576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853308.85590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853308.85729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853308.85837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853308.85895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853308.85988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853308.87631: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13273 1726853308.87674: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853308.87736: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853308.87808: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp9j9fwqbh /root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/AnsiballZ_package_facts.py <<< 13273 1726853308.87811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/AnsiballZ_package_facts.py" <<< 13273 1726853308.87874: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp9j9fwqbh" to remote "/root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/AnsiballZ_package_facts.py" <<< 13273 1726853308.89690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853308.89693: stderr chunk (state=3): >>><<< 13273 1726853308.89696: stdout chunk (state=3): >>><<< 13273 1726853308.89698: done transferring module to remote 13273 1726853308.89700: _low_level_execute_command(): starting 13273 1726853308.89703: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/ /root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/AnsiballZ_package_facts.py && sleep 0' 13273 1726853308.90252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853308.90263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853308.90360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853308.90396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853308.90486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853308.92417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853308.92421: stdout chunk (state=3): >>><<< 13273 1726853308.92424: stderr chunk (state=3): >>><<< 13273 1726853308.92439: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853308.92526: _low_level_execute_command(): starting 13273 1726853308.92530: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/AnsiballZ_package_facts.py && sleep 0' 13273 1726853308.93182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853308.93186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853308.93189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853308.93191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853308.93235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853308.93247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 13273 1726853308.93334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853309.38488: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 13273 1726853309.38513: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": 
[{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": 
"libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 13273 1726853309.38532: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": 
"openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 13273 1726853309.38537: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": 
"libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 13273 1726853309.38575: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", 
"version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", 
"release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 13273 1726853309.38591: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 13273 1726853309.38635: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", 
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", 
"version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": 
[{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": 
[{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 13273 1726853309.38641: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": 
[{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": 
[{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 13273 1726853309.38660: stdout 
chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 13273 1726853309.38672: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", 
"version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 13273 1726853309.38698: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 13273 1726853309.38702: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13273 1726853309.40465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
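The module result above is the JSON payload that `package_facts` returns: a `packages` mapping from package name to a list of entries, each with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields. As a minimal sketch of consuming that payload outside Ansible (the two sample entries below are copied from the log; the `nevra` helper is an illustrative name, not part of any Ansible API):

```python
import json

# Two entries lifted from the package_facts output logged above.
result_json = '''
{"ansible_facts": {"packages": {
  "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
           "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
               "epoch": 1, "arch": "x86_64", "source": "rpm"}]
}}}
'''

packages = json.loads(result_json)["ansible_facts"]["packages"]

def nevra(pkg):
    """Render one package entry as an RPM-style NEVRA string.

    A null epoch (JSON null -> Python None) is omitted, matching how
    rpm displays packages without an explicit epoch.
    """
    epoch = f"{pkg['epoch']}:" if pkg["epoch"] is not None else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

# Each name maps to a *list* because multiple versions of a package
# (e.g. kernel) can be installed side by side.
for name, entries in sorted(packages.items()):
    for pkg in entries:
        print(nevra(pkg))
```

Inside a playbook the same data is reachable as `ansible_facts.packages` after a `package_facts:` task, e.g. `when: "'git' in ansible_facts.packages"`.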
<<< 13273 1726853309.40496: stderr chunk (state=3): >>><<< 13273 1726853309.40500: stdout chunk (state=3): >>><<< 13273 1726853309.40541: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853309.41834: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853309.41850: _low_level_execute_command(): starting 13273 1726853309.41853: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853308.809776-14571-226488648880562/ > /dev/null 2>&1 && sleep 0' 13273 1726853309.42318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853309.42322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853309.42324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853309.42326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found <<< 13273 1726853309.42328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853309.42376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853309.42380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853309.42394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853309.42452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853309.44375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853309.44404: stderr chunk (state=3): >>><<< 13273 1726853309.44407: stdout chunk (state=3): >>><<< 13273 1726853309.44419: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853309.44425: handler run complete 13273 1726853309.44906: variable 'ansible_facts' from source: unknown 13273 1726853309.49153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.50191: variable 'ansible_facts' from source: unknown 13273 1726853309.50421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.50799: attempt loop complete, returning result 13273 1726853309.50808: _execute() done 13273 1726853309.50811: dumping result to json 13273 1726853309.50927: done dumping result, returning 13273 1726853309.50934: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-5fc3-657d-0000000006cb] 13273 1726853309.50936: sending task result for task 02083763-bbaf-5fc3-657d-0000000006cb ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853309.55717: done sending task result for task 02083763-bbaf-5fc3-657d-0000000006cb 13273 1726853309.55721: WORKER PROCESS EXITING 13273 1726853309.55729: no more pending results, returning what we have 13273 1726853309.55730: results queue empty 13273 1726853309.55731: checking for any_errors_fatal 13273 1726853309.55733: done checking for any_errors_fatal 13273 1726853309.55733: checking for max_fail_percentage 13273 1726853309.55734: done checking for max_fail_percentage 13273 1726853309.55734: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.55735: done checking to see if all hosts have failed 13273 1726853309.55735: getting the remaining hosts for this loop 13273 1726853309.55736: done getting the remaining hosts for this loop 13273 1726853309.55738: getting 
the next task for host managed_node3 13273 1726853309.55741: done getting next task for host managed_node3 13273 1726853309.55745: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853309.55747: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853309.55753: getting variables 13273 1726853309.55753: in VariableManager get_vars() 13273 1726853309.55775: Calling all_inventory to load vars for managed_node3 13273 1726853309.55776: Calling groups_inventory to load vars for managed_node3 13273 1726853309.55778: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.55782: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.55783: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.55785: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.56404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.57237: done with get_vars() 13273 1726853309.57253: done getting variables 13273 1726853309.57290: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print 
network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:28:29 -0400 (0:00:00.815) 0:00:27.462 ****** 13273 1726853309.57317: entering _queue_task() for managed_node3/debug 13273 1726853309.57578: worker is 1 (out of 1 available) 13273 1726853309.57592: exiting _queue_task() for managed_node3/debug 13273 1726853309.57604: done queuing things up, now waiting for results queue to drain 13273 1726853309.57605: waiting for pending results... 13273 1726853309.57792: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853309.57882: in run() - task 02083763-bbaf-5fc3-657d-00000000007c 13273 1726853309.57895: variable 'ansible_search_path' from source: unknown 13273 1726853309.57899: variable 'ansible_search_path' from source: unknown 13273 1726853309.57925: calling self._execute() 13273 1726853309.58005: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.58009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.58019: variable 'omit' from source: magic vars 13273 1726853309.58308: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.58317: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.58323: variable 'omit' from source: magic vars 13273 1726853309.58364: variable 'omit' from source: magic vars 13273 1726853309.58433: variable 'network_provider' from source: set_fact 13273 1726853309.58450: variable 'omit' from source: magic vars 13273 1726853309.58482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853309.58511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853309.58525: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853309.58539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853309.58551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853309.58575: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853309.58578: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.58583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.58651: Set connection var ansible_connection to ssh 13273 1726853309.58660: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853309.58665: Set connection var ansible_shell_executable to /bin/sh 13273 1726853309.58668: Set connection var ansible_shell_type to sh 13273 1726853309.58674: Set connection var ansible_pipelining to False 13273 1726853309.58679: Set connection var ansible_timeout to 10 13273 1726853309.58700: variable 'ansible_shell_executable' from source: unknown 13273 1726853309.58704: variable 'ansible_connection' from source: unknown 13273 1726853309.58706: variable 'ansible_module_compression' from source: unknown 13273 1726853309.58709: variable 'ansible_shell_type' from source: unknown 13273 1726853309.58711: variable 'ansible_shell_executable' from source: unknown 13273 1726853309.58716: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.58718: variable 'ansible_pipelining' from source: unknown 13273 1726853309.58721: variable 'ansible_timeout' from source: unknown 13273 1726853309.58723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.58824: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853309.58834: variable 'omit' from source: magic vars 13273 1726853309.58837: starting attempt loop 13273 1726853309.58840: running the handler 13273 1726853309.58880: handler run complete 13273 1726853309.58890: attempt loop complete, returning result 13273 1726853309.58893: _execute() done 13273 1726853309.58896: dumping result to json 13273 1726853309.58899: done dumping result, returning 13273 1726853309.58905: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-5fc3-657d-00000000007c] 13273 1726853309.58913: sending task result for task 02083763-bbaf-5fc3-657d-00000000007c 13273 1726853309.58990: done sending task result for task 02083763-bbaf-5fc3-657d-00000000007c 13273 1726853309.58993: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 13273 1726853309.59075: no more pending results, returning what we have 13273 1726853309.59078: results queue empty 13273 1726853309.59079: checking for any_errors_fatal 13273 1726853309.59089: done checking for any_errors_fatal 13273 1726853309.59089: checking for max_fail_percentage 13273 1726853309.59091: done checking for max_fail_percentage 13273 1726853309.59091: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.59092: done checking to see if all hosts have failed 13273 1726853309.59093: getting the remaining hosts for this loop 13273 1726853309.59094: done getting the remaining hosts for this loop 13273 1726853309.59097: getting the next task for host managed_node3 13273 1726853309.59103: done getting next task for host managed_node3 13273 1726853309.59107: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 13273 1726853309.59110: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853309.59121: getting variables 13273 1726853309.59122: in VariableManager get_vars() 13273 1726853309.59164: Calling all_inventory to load vars for managed_node3 13273 1726853309.59167: Calling groups_inventory to load vars for managed_node3 13273 1726853309.59169: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.59185: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.59188: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.59190: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.60030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.60895: done with get_vars() 13273 1726853309.60912: done getting variables 13273 1726853309.60951: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:28:29 -0400 (0:00:00.036) 0:00:27.498 ****** 13273 1726853309.60977: entering _queue_task() for managed_node3/fail 13273 1726853309.61199: worker is 1 (out of 1 available) 13273 1726853309.61212: exiting _queue_task() for managed_node3/fail 13273 1726853309.61224: done queuing things up, now waiting for results queue to drain 13273 1726853309.61225: waiting for pending results... 13273 1726853309.61402: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853309.61489: in run() - task 02083763-bbaf-5fc3-657d-00000000007d 13273 1726853309.61500: variable 'ansible_search_path' from source: unknown 13273 1726853309.61505: variable 'ansible_search_path' from source: unknown 13273 1726853309.61531: calling self._execute() 13273 1726853309.61604: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.61608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.61616: variable 'omit' from source: magic vars 13273 1726853309.61892: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.61897: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.61978: variable 'network_state' from source: role '' defaults 13273 1726853309.61985: Evaluated conditional (network_state != {}): False 13273 1726853309.61991: when evaluation is False, skipping this task 13273 1726853309.61994: _execute() done 13273 1726853309.61996: dumping result to json 13273 1726853309.61999: done dumping result, returning 13273 1726853309.62010: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [02083763-bbaf-5fc3-657d-00000000007d] 13273 1726853309.62013: sending task result for task 02083763-bbaf-5fc3-657d-00000000007d 13273 1726853309.62091: done sending task result for task 02083763-bbaf-5fc3-657d-00000000007d 13273 1726853309.62093: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853309.62151: no more pending results, returning what we have 13273 1726853309.62155: results queue empty 13273 1726853309.62156: checking for any_errors_fatal 13273 1726853309.62161: done checking for any_errors_fatal 13273 1726853309.62162: checking for max_fail_percentage 13273 1726853309.62163: done checking for max_fail_percentage 13273 1726853309.62164: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.62164: done checking to see if all hosts have failed 13273 1726853309.62165: getting the remaining hosts for this loop 13273 1726853309.62166: done getting the remaining hosts for this loop 13273 1726853309.62169: getting the next task for host managed_node3 13273 1726853309.62177: done getting next task for host managed_node3 13273 1726853309.62180: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853309.62183: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 13273 1726853309.62197: getting variables 13273 1726853309.62198: in VariableManager get_vars() 13273 1726853309.62238: Calling all_inventory to load vars for managed_node3 13273 1726853309.62240: Calling groups_inventory to load vars for managed_node3 13273 1726853309.62242: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.62250: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.62252: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.62254: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.62962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.63825: done with get_vars() 13273 1726853309.63838: done getting variables 13273 1726853309.63879: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:28:29 -0400 (0:00:00.029) 0:00:27.528 ****** 13273 1726853309.63903: entering _queue_task() for managed_node3/fail 13273 1726853309.64102: worker is 1 (out of 1 available) 13273 1726853309.64115: exiting _queue_task() for managed_node3/fail 13273 1726853309.64130: done queuing things up, now waiting for results queue to drain 13273 1726853309.64131: waiting for pending results... 
13273 1726853309.64301: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853309.64392: in run() - task 02083763-bbaf-5fc3-657d-00000000007e 13273 1726853309.64402: variable 'ansible_search_path' from source: unknown 13273 1726853309.64406: variable 'ansible_search_path' from source: unknown 13273 1726853309.64432: calling self._execute() 13273 1726853309.64507: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.64510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.64520: variable 'omit' from source: magic vars 13273 1726853309.64783: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.64798: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.64876: variable 'network_state' from source: role '' defaults 13273 1726853309.64885: Evaluated conditional (network_state != {}): False 13273 1726853309.64888: when evaluation is False, skipping this task 13273 1726853309.64891: _execute() done 13273 1726853309.64894: dumping result to json 13273 1726853309.64896: done dumping result, returning 13273 1726853309.64903: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-5fc3-657d-00000000007e] 13273 1726853309.64914: sending task result for task 02083763-bbaf-5fc3-657d-00000000007e 13273 1726853309.64999: done sending task result for task 02083763-bbaf-5fc3-657d-00000000007e 13273 1726853309.65001: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853309.65063: no more pending results, returning what we have 13273 
1726853309.65066: results queue empty 13273 1726853309.65067: checking for any_errors_fatal 13273 1726853309.65076: done checking for any_errors_fatal 13273 1726853309.65077: checking for max_fail_percentage 13273 1726853309.65078: done checking for max_fail_percentage 13273 1726853309.65079: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.65080: done checking to see if all hosts have failed 13273 1726853309.65080: getting the remaining hosts for this loop 13273 1726853309.65082: done getting the remaining hosts for this loop 13273 1726853309.65085: getting the next task for host managed_node3 13273 1726853309.65091: done getting next task for host managed_node3 13273 1726853309.65094: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853309.65097: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853309.65111: getting variables 13273 1726853309.65112: in VariableManager get_vars() 13273 1726853309.65155: Calling all_inventory to load vars for managed_node3 13273 1726853309.65158: Calling groups_inventory to load vars for managed_node3 13273 1726853309.65160: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.65168: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.65170: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.65179: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.66041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.66891: done with get_vars() 13273 1726853309.66904: done getting variables 13273 1726853309.66950: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:28:29 -0400 (0:00:00.030) 0:00:27.558 ****** 13273 1726853309.66974: entering _queue_task() for managed_node3/fail 13273 1726853309.67191: worker is 1 (out of 1 available) 13273 1726853309.67204: exiting _queue_task() for managed_node3/fail 13273 1726853309.67215: done queuing things up, now waiting for results queue to drain 13273 1726853309.67216: waiting for pending results... 
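The skip recorded above ("Abort applying the network state configuration...") follows from the role default leaving `network_state` empty, so the task's `when: network_state != {}` condition is False. A minimal plain-Python sketch of that evaluation (an illustration, not Ansible's actual variable manager; the variable names mirror the log):

```python
# The role '' defaults supply network_state as an empty dict (per the log),
# and no play/host var overrides it for this run.
role_defaults = {"network_state": {}}
task_vars = {}  # no override present in this play

# Simplified low-to-high precedence merge, then the task's `when` expression.
network_state = {**role_defaults, **task_vars}["network_state"]
conditional = network_state != {}

print(conditional)  # False -> "when evaluation is False, skipping this task"
```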
13273 1726853309.67391: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853309.67478: in run() - task 02083763-bbaf-5fc3-657d-00000000007f 13273 1726853309.67489: variable 'ansible_search_path' from source: unknown 13273 1726853309.67492: variable 'ansible_search_path' from source: unknown 13273 1726853309.67519: calling self._execute() 13273 1726853309.67589: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.67595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.67604: variable 'omit' from source: magic vars 13273 1726853309.67867: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.67879: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.67994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853309.69461: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853309.69507: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853309.69533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853309.69563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853309.69585: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853309.69645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.69680: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.69698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.69724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.69735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.69805: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.69816: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13273 1726853309.69897: variable 'ansible_distribution' from source: facts 13273 1726853309.69900: variable '__network_rh_distros' from source: role '' defaults 13273 1726853309.69908: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13273 1726853309.70063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.70086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.70103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 
1726853309.70128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.70138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.70175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.70193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.70209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.70233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.70243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.70275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.70296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13273 1726853309.70310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.70334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.70344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.70533: variable 'network_connections' from source: task vars 13273 1726853309.70542: variable 'controller_profile' from source: play vars 13273 1726853309.70590: variable 'controller_profile' from source: play vars 13273 1726853309.70598: variable 'network_state' from source: role '' defaults 13273 1726853309.70646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853309.70758: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853309.70785: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853309.70808: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853309.70831: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853309.70863: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853309.70879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853309.70900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.70919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853309.70937: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13273 1726853309.70942: when evaluation is False, skipping this task 13273 1726853309.70945: _execute() done 13273 1726853309.70947: dumping result to json 13273 1726853309.70950: done dumping result, returning 13273 1726853309.70960: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-5fc3-657d-00000000007f] 13273 1726853309.70963: sending task result for task 02083763-bbaf-5fc3-657d-00000000007f 13273 1726853309.71045: done sending task result for task 02083763-bbaf-5fc3-657d-00000000007f 13273 1726853309.71048: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13273 1726853309.71094: no more pending results, returning what we have 13273 
1726853309.71097: results queue empty 13273 1726853309.71098: checking for any_errors_fatal 13273 1726853309.71103: done checking for any_errors_fatal 13273 1726853309.71104: checking for max_fail_percentage 13273 1726853309.71105: done checking for max_fail_percentage 13273 1726853309.71106: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.71107: done checking to see if all hosts have failed 13273 1726853309.71107: getting the remaining hosts for this loop 13273 1726853309.71108: done getting the remaining hosts for this loop 13273 1726853309.71112: getting the next task for host managed_node3 13273 1726853309.71117: done getting next task for host managed_node3 13273 1726853309.71120: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853309.71122: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853309.71139: getting variables 13273 1726853309.71140: in VariableManager get_vars() 13273 1726853309.71188: Calling all_inventory to load vars for managed_node3 13273 1726853309.71191: Calling groups_inventory to load vars for managed_node3 13273 1726853309.71193: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.71201: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.71204: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.71206: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.71996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.72858: done with get_vars() 13273 1726853309.72875: done getting variables 13273 1726853309.72917: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:28:29 -0400 (0:00:00.059) 0:00:27.618 ****** 13273 1726853309.72940: entering _queue_task() for managed_node3/dnf 13273 1726853309.73159: worker is 1 (out of 1 available) 13273 1726853309.73177: exiting _queue_task() for managed_node3/dnf 13273 1726853309.73188: done queuing things up, now waiting for results queue to drain 13273 1726853309.73189: waiting for pending results... 
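The teaming-abort task above was skipped because its conditional chain found no `team`-typed entries. A rough pure-Python equivalent of the logged Jinja2 expression `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` (the `bond0` entry below is a hypothetical stand-in; the log only shows that a `controller_profile` play var populated the list):

```python
import re

# Hypothetical example data; the real connection list is not printed in the log.
network_connections = [{"name": "bond0", "type": "bond"}]
network_state = {}  # role default, per the log

def has_team(items):
    # selectattr("type", "defined") keeps mappings that carry a 'type' key;
    # selectattr("type", "match", "^team$") then applies an anchored regex.
    defined = [c for c in items if "type" in c]
    return len([c for c in defined if re.match(r"^team$", c["type"])]) > 0

result = has_team(network_connections) or has_team(network_state.get("interfaces", []))
print(result)  # False -> the EL10 teaming abort is skipped
```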
13273 1726853309.73357: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853309.73445: in run() - task 02083763-bbaf-5fc3-657d-000000000080 13273 1726853309.73457: variable 'ansible_search_path' from source: unknown 13273 1726853309.73461: variable 'ansible_search_path' from source: unknown 13273 1726853309.73491: calling self._execute() 13273 1726853309.73563: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.73567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.73577: variable 'omit' from source: magic vars 13273 1726853309.73841: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.73853: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.73995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853309.75687: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853309.75732: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853309.75758: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853309.75784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853309.75805: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853309.75863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.75884: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.75901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.75931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.75942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.76019: variable 'ansible_distribution' from source: facts 13273 1726853309.76025: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.76037: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13273 1726853309.76113: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853309.76198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.76214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.76232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.76262: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.76274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.76301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.76317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.76332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.76364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.76368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.76398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.76413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 
1726853309.76429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.76453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.76464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.76569: variable 'network_connections' from source: task vars 13273 1726853309.76583: variable 'controller_profile' from source: play vars 13273 1726853309.76625: variable 'controller_profile' from source: play vars 13273 1726853309.76675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853309.76783: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853309.76812: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853309.76833: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853309.76855: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853309.76885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853309.76904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853309.76924: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.76942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853309.76978: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853309.77138: variable 'network_connections' from source: task vars 13273 1726853309.77141: variable 'controller_profile' from source: play vars 13273 1726853309.77169: variable 'controller_profile' from source: play vars 13273 1726853309.77188: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853309.77191: when evaluation is False, skipping this task 13273 1726853309.77194: _execute() done 13273 1726853309.77196: dumping result to json 13273 1726853309.77199: done dumping result, returning 13273 1726853309.77207: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000080] 13273 1726853309.77210: sending task result for task 02083763-bbaf-5fc3-657d-000000000080 13273 1726853309.77299: done sending task result for task 02083763-bbaf-5fc3-657d-000000000080 13273 1726853309.77302: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853309.77380: no more pending results, returning what we have 13273 1726853309.77383: results queue empty 13273 1726853309.77384: checking for any_errors_fatal 13273 1726853309.77389: done checking for 
any_errors_fatal 13273 1726853309.77390: checking for max_fail_percentage 13273 1726853309.77392: done checking for max_fail_percentage 13273 1726853309.77392: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.77393: done checking to see if all hosts have failed 13273 1726853309.77394: getting the remaining hosts for this loop 13273 1726853309.77395: done getting the remaining hosts for this loop 13273 1726853309.77398: getting the next task for host managed_node3 13273 1726853309.77403: done getting next task for host managed_node3 13273 1726853309.77407: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13273 1726853309.77409: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853309.77429: getting variables 13273 1726853309.77430: in VariableManager get_vars() 13273 1726853309.77479: Calling all_inventory to load vars for managed_node3 13273 1726853309.77482: Calling groups_inventory to load vars for managed_node3 13273 1726853309.77484: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.77492: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.77494: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.77496: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.78355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.79210: done with get_vars() 13273 1726853309.79224: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853309.79279: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:28:29 -0400 (0:00:00.063) 0:00:27.682 ****** 13273 1726853309.79303: entering _queue_task() for managed_node3/yum 13273 1726853309.79528: worker is 1 (out of 1 available) 13273 1726853309.79541: exiting _queue_task() for managed_node3/yum 13273 1726853309.79555: done queuing things up, now waiting for results queue to drain 13273 1726853309.79557: waiting for pending results... 
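The DNF-update check above is gated on `__network_wireless_connections_defined or __network_team_connections_defined`, both of which resolved False for this connection set. A simplified stand-in for those role-default flags (plain Python, not the role's actual Jinja2 templates; the example entry is assumed):

```python
# Assumed connection list; the log shows only that play vars supplied it.
network_connections = [{"name": "bond0", "type": "bond"}]

# Stand-ins for the two role-default booleans seen in the log.
wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

run_dnf_check = wireless_defined or team_defined
print(run_dnf_check)  # False -> "skip_reason": "Conditional result was False"
```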
13273 1726853309.79729: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13273 1726853309.79815: in run() - task 02083763-bbaf-5fc3-657d-000000000081 13273 1726853309.79827: variable 'ansible_search_path' from source: unknown 13273 1726853309.79831: variable 'ansible_search_path' from source: unknown 13273 1726853309.79860: calling self._execute() 13273 1726853309.79942: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.79951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.79960: variable 'omit' from source: magic vars 13273 1726853309.80238: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.80249: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.80369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853309.81865: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853309.81907: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853309.81932: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853309.81962: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853309.81984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853309.82039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.82074: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.82095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.82120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.82131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.82205: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.82217: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13273 1726853309.82222: when evaluation is False, skipping this task 13273 1726853309.82225: _execute() done 13273 1726853309.82228: dumping result to json 13273 1726853309.82230: done dumping result, returning 13273 1726853309.82237: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000081] 13273 1726853309.82240: sending task result for task 02083763-bbaf-5fc3-657d-000000000081 13273 1726853309.82325: done sending task result for task 02083763-bbaf-5fc3-657d-000000000081 13273 1726853309.82328: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13273 1726853309.82376: no more pending results, returning 
what we have 13273 1726853309.82379: results queue empty 13273 1726853309.82380: checking for any_errors_fatal 13273 1726853309.82386: done checking for any_errors_fatal 13273 1726853309.82387: checking for max_fail_percentage 13273 1726853309.82388: done checking for max_fail_percentage 13273 1726853309.82389: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.82390: done checking to see if all hosts have failed 13273 1726853309.82390: getting the remaining hosts for this loop 13273 1726853309.82392: done getting the remaining hosts for this loop 13273 1726853309.82395: getting the next task for host managed_node3 13273 1726853309.82401: done getting next task for host managed_node3 13273 1726853309.82404: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13273 1726853309.82406: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853309.82423: getting variables 13273 1726853309.82424: in VariableManager get_vars() 13273 1726853309.82473: Calling all_inventory to load vars for managed_node3 13273 1726853309.82476: Calling groups_inventory to load vars for managed_node3 13273 1726853309.82478: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.82486: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.82489: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.82491: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.83261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.84213: done with get_vars() 13273 1726853309.84228: done getting variables 13273 1726853309.84268: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:28:29 -0400 (0:00:00.049) 0:00:27.732 ****** 13273 1726853309.84293: entering _queue_task() for managed_node3/fail 13273 1726853309.84513: worker is 1 (out of 1 available) 13273 1726853309.84527: exiting _queue_task() for managed_node3/fail 13273 1726853309.84538: done queuing things up, now waiting for results queue to drain 13273 1726853309.84539: waiting for pending results... 
13273 1726853309.84720: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13273 1726853309.84811: in run() - task 02083763-bbaf-5fc3-657d-000000000082 13273 1726853309.84822: variable 'ansible_search_path' from source: unknown 13273 1726853309.84825: variable 'ansible_search_path' from source: unknown 13273 1726853309.84854: calling self._execute() 13273 1726853309.84929: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.84933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.84944: variable 'omit' from source: magic vars 13273 1726853309.85209: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.85218: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.85297: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853309.85426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853309.86862: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853309.86905: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853309.86930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853309.86959: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853309.86981: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853309.87038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853309.87073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.87090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.87115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.87126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.87164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.87180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.87197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.87221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.87231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.87260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.87280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.87296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.87320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.87330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.87442: variable 'network_connections' from source: task vars 13273 1726853309.87451: variable 'controller_profile' from source: play vars 13273 1726853309.87500: variable 'controller_profile' from source: play vars 13273 1726853309.87549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853309.87656: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853309.87683: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853309.87710: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 
1726853309.87730: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853309.87760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853309.87775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853309.87793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.87816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853309.87849: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853309.88000: variable 'network_connections' from source: task vars 13273 1726853309.88003: variable 'controller_profile' from source: play vars 13273 1726853309.88047: variable 'controller_profile' from source: play vars 13273 1726853309.88070: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853309.88075: when evaluation is False, skipping this task 13273 1726853309.88078: _execute() done 13273 1726853309.88081: dumping result to json 13273 1726853309.88083: done dumping result, returning 13273 1726853309.88092: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000082] 13273 1726853309.88096: sending task result for task 02083763-bbaf-5fc3-657d-000000000082 13273 1726853309.88182: 
done sending task result for task 02083763-bbaf-5fc3-657d-000000000082 13273 1726853309.88184: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853309.88233: no more pending results, returning what we have 13273 1726853309.88236: results queue empty 13273 1726853309.88237: checking for any_errors_fatal 13273 1726853309.88242: done checking for any_errors_fatal 13273 1726853309.88244: checking for max_fail_percentage 13273 1726853309.88246: done checking for max_fail_percentage 13273 1726853309.88247: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.88247: done checking to see if all hosts have failed 13273 1726853309.88248: getting the remaining hosts for this loop 13273 1726853309.88249: done getting the remaining hosts for this loop 13273 1726853309.88252: getting the next task for host managed_node3 13273 1726853309.88259: done getting next task for host managed_node3 13273 1726853309.88262: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13273 1726853309.88265: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853309.88287: getting variables 13273 1726853309.88289: in VariableManager get_vars() 13273 1726853309.88338: Calling all_inventory to load vars for managed_node3 13273 1726853309.88341: Calling groups_inventory to load vars for managed_node3 13273 1726853309.88345: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.88353: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.88356: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.88358: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.89135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.89992: done with get_vars() 13273 1726853309.90010: done getting variables 13273 1726853309.90053: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:28:29 -0400 (0:00:00.057) 0:00:27.789 ****** 13273 1726853309.90079: entering _queue_task() for managed_node3/package 13273 1726853309.90310: worker is 1 (out of 1 available) 13273 1726853309.90322: exiting _queue_task() for managed_node3/package 13273 1726853309.90333: done queuing things up, now waiting for results queue to drain 13273 1726853309.90334: waiting for pending results... 
13273 1726853309.90505: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 13273 1726853309.90595: in run() - task 02083763-bbaf-5fc3-657d-000000000083 13273 1726853309.90606: variable 'ansible_search_path' from source: unknown 13273 1726853309.90610: variable 'ansible_search_path' from source: unknown 13273 1726853309.90638: calling self._execute() 13273 1726853309.90707: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.90713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.90721: variable 'omit' from source: magic vars 13273 1726853309.90979: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.90988: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.91122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853309.91304: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853309.91337: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853309.91364: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853309.91412: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853309.91487: variable 'network_packages' from source: role '' defaults 13273 1726853309.91558: variable '__network_provider_setup' from source: role '' defaults 13273 1726853309.91567: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853309.91613: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853309.91620: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853309.91664: variable 
'__network_packages_default_nm' from source: role '' defaults 13273 1726853309.91776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853309.93282: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853309.93320: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853309.93347: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853309.93369: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853309.93390: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853309.93448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.93466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.93485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.93514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.93524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 
1726853309.93555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.93572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.93588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.93616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.93625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.93760: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853309.93833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.93848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.93864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.93889: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.93900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.93960: variable 'ansible_python' from source: facts 13273 1726853309.93980: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853309.94033: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853309.94091: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853309.94172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.94189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.94205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.94230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.94240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.94277: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853309.94296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853309.94311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.94335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853309.94348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853309.94439: variable 'network_connections' from source: task vars 13273 1726853309.94445: variable 'controller_profile' from source: play vars 13273 1726853309.94515: variable 'controller_profile' from source: play vars 13273 1726853309.94561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853309.94583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853309.94606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853309.94627: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853309.94660: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853309.94832: variable 'network_connections' from source: task vars 13273 1726853309.94836: variable 'controller_profile' from source: play vars 13273 1726853309.94904: variable 'controller_profile' from source: play vars 13273 1726853309.94931: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853309.94984: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853309.95185: variable 'network_connections' from source: task vars 13273 1726853309.95189: variable 'controller_profile' from source: play vars 13273 1726853309.95232: variable 'controller_profile' from source: play vars 13273 1726853309.95251: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853309.95304: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853309.95497: variable 'network_connections' from source: task vars 13273 1726853309.95500: variable 'controller_profile' from source: play vars 13273 1726853309.95547: variable 'controller_profile' from source: play vars 13273 1726853309.95584: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853309.95623: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853309.95629: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853309.95672: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853309.95803: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853309.96089: variable 'network_connections' from source: task vars 13273 
1726853309.96092: variable 'controller_profile' from source: play vars 13273 1726853309.96136: variable 'controller_profile' from source: play vars 13273 1726853309.96144: variable 'ansible_distribution' from source: facts 13273 1726853309.96148: variable '__network_rh_distros' from source: role '' defaults 13273 1726853309.96150: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.96161: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853309.96266: variable 'ansible_distribution' from source: facts 13273 1726853309.96270: variable '__network_rh_distros' from source: role '' defaults 13273 1726853309.96275: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.96286: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853309.96390: variable 'ansible_distribution' from source: facts 13273 1726853309.96394: variable '__network_rh_distros' from source: role '' defaults 13273 1726853309.96397: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.96421: variable 'network_provider' from source: set_fact 13273 1726853309.96432: variable 'ansible_facts' from source: unknown 13273 1726853309.96774: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13273 1726853309.96778: when evaluation is False, skipping this task 13273 1726853309.96780: _execute() done 13273 1726853309.96783: dumping result to json 13273 1726853309.96785: done dumping result, returning 13273 1726853309.96791: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-5fc3-657d-000000000083] 13273 1726853309.96796: sending task result for task 02083763-bbaf-5fc3-657d-000000000083 13273 1726853309.96882: done sending task result for task 02083763-bbaf-5fc3-657d-000000000083 13273 1726853309.96885: WORKER PROCESS 
EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13273 1726853309.96933: no more pending results, returning what we have 13273 1726853309.96936: results queue empty 13273 1726853309.96937: checking for any_errors_fatal 13273 1726853309.96945: done checking for any_errors_fatal 13273 1726853309.96945: checking for max_fail_percentage 13273 1726853309.96947: done checking for max_fail_percentage 13273 1726853309.96948: checking to see if all hosts have failed and the running result is not ok 13273 1726853309.96948: done checking to see if all hosts have failed 13273 1726853309.96949: getting the remaining hosts for this loop 13273 1726853309.96950: done getting the remaining hosts for this loop 13273 1726853309.96953: getting the next task for host managed_node3 13273 1726853309.96959: done getting next task for host managed_node3 13273 1726853309.96962: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853309.96965: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853309.96983: getting variables 13273 1726853309.96985: in VariableManager get_vars() 13273 1726853309.97033: Calling all_inventory to load vars for managed_node3 13273 1726853309.97036: Calling groups_inventory to load vars for managed_node3 13273 1726853309.97039: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853309.97049: Calling all_plugins_play to load vars for managed_node3 13273 1726853309.97052: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853309.97054: Calling groups_plugins_play to load vars for managed_node3 13273 1726853309.97984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853309.98825: done with get_vars() 13273 1726853309.98840: done getting variables 13273 1726853309.98883: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:28:29 -0400 (0:00:00.088) 0:00:27.878 ****** 13273 1726853309.98905: entering _queue_task() for managed_node3/package 13273 1726853309.99123: worker is 1 (out of 1 available) 13273 1726853309.99135: exiting _queue_task() for managed_node3/package 13273 1726853309.99147: done queuing things up, now waiting for results queue to drain 13273 1726853309.99148: waiting for pending results... 
13273 1726853309.99326: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853309.99417: in run() - task 02083763-bbaf-5fc3-657d-000000000084 13273 1726853309.99429: variable 'ansible_search_path' from source: unknown 13273 1726853309.99432: variable 'ansible_search_path' from source: unknown 13273 1726853309.99463: calling self._execute() 13273 1726853309.99534: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853309.99539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853309.99549: variable 'omit' from source: magic vars 13273 1726853309.99815: variable 'ansible_distribution_major_version' from source: facts 13273 1726853309.99823: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853309.99903: variable 'network_state' from source: role '' defaults 13273 1726853309.99911: Evaluated conditional (network_state != {}): False 13273 1726853309.99918: when evaluation is False, skipping this task 13273 1726853309.99922: _execute() done 13273 1726853309.99926: dumping result to json 13273 1726853309.99929: done dumping result, returning 13273 1726853309.99931: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-5fc3-657d-000000000084] 13273 1726853309.99935: sending task result for task 02083763-bbaf-5fc3-657d-000000000084 13273 1726853310.00021: done sending task result for task 02083763-bbaf-5fc3-657d-000000000084 13273 1726853310.00024: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853310.00092: no more pending results, returning what we have 13273 1726853310.00095: results queue empty 13273 1726853310.00096: checking 
for any_errors_fatal 13273 1726853310.00102: done checking for any_errors_fatal 13273 1726853310.00103: checking for max_fail_percentage 13273 1726853310.00104: done checking for max_fail_percentage 13273 1726853310.00104: checking to see if all hosts have failed and the running result is not ok 13273 1726853310.00105: done checking to see if all hosts have failed 13273 1726853310.00106: getting the remaining hosts for this loop 13273 1726853310.00107: done getting the remaining hosts for this loop 13273 1726853310.00110: getting the next task for host managed_node3 13273 1726853310.00115: done getting next task for host managed_node3 13273 1726853310.00119: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853310.00121: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853310.00136: getting variables 13273 1726853310.00137: in VariableManager get_vars() 13273 1726853310.00179: Calling all_inventory to load vars for managed_node3 13273 1726853310.00182: Calling groups_inventory to load vars for managed_node3 13273 1726853310.00184: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853310.00192: Calling all_plugins_play to load vars for managed_node3 13273 1726853310.00194: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853310.00197: Calling groups_plugins_play to load vars for managed_node3 13273 1726853310.00934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853310.01866: done with get_vars() 13273 1726853310.01882: done getting variables 13273 1726853310.01920: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:28:30 -0400 (0:00:00.030) 0:00:27.908 ****** 13273 1726853310.01945: entering _queue_task() for managed_node3/package 13273 1726853310.02146: worker is 1 (out of 1 available) 13273 1726853310.02158: exiting _queue_task() for managed_node3/package 13273 1726853310.02170: done queuing things up, now waiting for results queue to drain 13273 1726853310.02172: waiting for pending results... 
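When a conditional fails, the worker sends back the structured skip payload visible in the log ("skipping: [managed_node3] => { ... }"). The sketch below rebuilds that payload shape; the field names (`changed`, `false_condition`, `skip_reason`) are copied from the log output, while `skip_result` itself is a hypothetical helper, not ansible-core code.

```python
# Reconstructs the skip result dict a conditional-failed task reports,
# matching the fields printed in the log above.
def skip_result(false_condition: str) -> dict:
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }

print(skip_result("network_state != {}"))
```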
13273 1726853310.02338: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853310.02424: in run() - task 02083763-bbaf-5fc3-657d-000000000085 13273 1726853310.02435: variable 'ansible_search_path' from source: unknown 13273 1726853310.02439: variable 'ansible_search_path' from source: unknown 13273 1726853310.02465: calling self._execute() 13273 1726853310.02604: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853310.02611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853310.02614: variable 'omit' from source: magic vars 13273 1726853310.03077: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.03081: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853310.03084: variable 'network_state' from source: role '' defaults 13273 1726853310.03087: Evaluated conditional (network_state != {}): False 13273 1726853310.03089: when evaluation is False, skipping this task 13273 1726853310.03092: _execute() done 13273 1726853310.03098: dumping result to json 13273 1726853310.03103: done dumping result, returning 13273 1726853310.03114: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-5fc3-657d-000000000085] 13273 1726853310.03123: sending task result for task 02083763-bbaf-5fc3-657d-000000000085 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853310.03270: no more pending results, returning what we have 13273 1726853310.03276: results queue empty 13273 1726853310.03277: checking for any_errors_fatal 13273 1726853310.03281: done checking for any_errors_fatal 13273 1726853310.03282: checking for max_fail_percentage 13273 
1726853310.03283: done checking for max_fail_percentage 13273 1726853310.03284: checking to see if all hosts have failed and the running result is not ok 13273 1726853310.03284: done checking to see if all hosts have failed 13273 1726853310.03285: getting the remaining hosts for this loop 13273 1726853310.03286: done getting the remaining hosts for this loop 13273 1726853310.03289: getting the next task for host managed_node3 13273 1726853310.03294: done getting next task for host managed_node3 13273 1726853310.03298: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853310.03301: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853310.03331: getting variables 13273 1726853310.03332: in VariableManager get_vars() 13273 1726853310.03387: Calling all_inventory to load vars for managed_node3 13273 1726853310.03390: Calling groups_inventory to load vars for managed_node3 13273 1726853310.03392: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853310.03402: Calling all_plugins_play to load vars for managed_node3 13273 1726853310.03405: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853310.03408: Calling groups_plugins_play to load vars for managed_node3 13273 1726853310.03926: done sending task result for task 02083763-bbaf-5fc3-657d-000000000085 13273 1726853310.03929: WORKER PROCESS EXITING 13273 1726853310.04727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853310.05566: done with get_vars() 13273 1726853310.05582: done getting variables 13273 1726853310.05620: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:28:30 -0400 (0:00:00.037) 0:00:27.945 ****** 13273 1726853310.05645: entering _queue_task() for managed_node3/service 13273 1726853310.05839: worker is 1 (out of 1 available) 13273 1726853310.05854: exiting _queue_task() for managed_node3/service 13273 1726853310.05865: done queuing things up, now waiting for results queue to drain 13273 1726853310.05866: waiting for pending results... 
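The next task, "Restart NetworkManager due to wireless or team interfaces", is gated on `__network_wireless_connections_defined or __network_team_connections_defined`, and the log shows both being derived while iterating `network_connections`. A minimal sketch of that derivation, assuming the role scans connection entries for a `type` of `wireless` or `team` (the variable names come from the log; the scan logic and the sample `bond` entry are assumptions):

```python
# Hypothetical reconstruction of how the role's defaults could derive the
# two booleans named in the log from the network_connections task var.
def connections_of_type(network_connections: list, iface_type: str) -> bool:
    return any(c.get("type") == iface_type for c in network_connections)

network_connections = [{"name": "bond0", "type": "bond"}]  # assumed sample data
wireless = connections_of_type(network_connections, "wireless")
team = connections_of_type(network_connections, "team")
print(wireless or team)  # False -> the restart task is skipped, as logged
```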
13273 1726853310.06037: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853310.06124: in run() - task 02083763-bbaf-5fc3-657d-000000000086 13273 1726853310.06134: variable 'ansible_search_path' from source: unknown 13273 1726853310.06137: variable 'ansible_search_path' from source: unknown 13273 1726853310.06167: calling self._execute() 13273 1726853310.06237: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853310.06242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853310.06252: variable 'omit' from source: magic vars 13273 1726853310.06776: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.06779: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853310.06781: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853310.06904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853310.08978: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853310.09050: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853310.09095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853310.09131: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853310.09161: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853310.09251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13273 1726853310.09299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.09329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.09378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.09396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.09447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.09476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.09504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.09548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.09566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.09610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.09641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.09676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.09721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.09738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.09911: variable 'network_connections' from source: task vars 13273 1726853310.09929: variable 'controller_profile' from source: play vars 13273 1726853310.10001: variable 'controller_profile' from source: play vars 13273 1726853310.10079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853310.10246: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853310.10289: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853310.10322: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 
1726853310.10358: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853310.10407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853310.10434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853310.10467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.10498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853310.10776: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853310.10796: variable 'network_connections' from source: task vars 13273 1726853310.10806: variable 'controller_profile' from source: play vars 13273 1726853310.10873: variable 'controller_profile' from source: play vars 13273 1726853310.10903: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853310.10911: when evaluation is False, skipping this task 13273 1726853310.10917: _execute() done 13273 1726853310.10923: dumping result to json 13273 1726853310.10929: done dumping result, returning 13273 1726853310.10940: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000086] 13273 1726853310.10953: sending task result for task 02083763-bbaf-5fc3-657d-000000000086 skipping: [managed_node3] => { "changed": 
false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853310.11103: done sending task result for task 02083763-bbaf-5fc3-657d-000000000086 13273 1726853310.11120: WORKER PROCESS EXITING 13273 1726853310.11116: no more pending results, returning what we have 13273 1726853310.11124: results queue empty 13273 1726853310.11125: checking for any_errors_fatal 13273 1726853310.11129: done checking for any_errors_fatal 13273 1726853310.11130: checking for max_fail_percentage 13273 1726853310.11132: done checking for max_fail_percentage 13273 1726853310.11132: checking to see if all hosts have failed and the running result is not ok 13273 1726853310.11133: done checking to see if all hosts have failed 13273 1726853310.11133: getting the remaining hosts for this loop 13273 1726853310.11135: done getting the remaining hosts for this loop 13273 1726853310.11138: getting the next task for host managed_node3 13273 1726853310.11144: done getting next task for host managed_node3 13273 1726853310.11148: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853310.11150: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853310.11383: getting variables 13273 1726853310.11385: in VariableManager get_vars() 13273 1726853310.11432: Calling all_inventory to load vars for managed_node3 13273 1726853310.11435: Calling groups_inventory to load vars for managed_node3 13273 1726853310.11437: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853310.11445: Calling all_plugins_play to load vars for managed_node3 13273 1726853310.11447: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853310.11450: Calling groups_plugins_play to load vars for managed_node3 13273 1726853310.12723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853310.15098: done with get_vars() 13273 1726853310.15122: done getting variables 13273 1726853310.15400: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:28:30 -0400 (0:00:00.097) 0:00:28.043 ****** 13273 1726853310.15435: entering _queue_task() for managed_node3/service 13273 1726853310.15892: worker is 1 (out of 1 available) 13273 1726853310.15905: exiting _queue_task() for managed_node3/service 13273 1726853310.15918: done queuing things up, now waiting for results queue to drain 13273 1726853310.15919: waiting for pending results... 
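Unlike the previous skips, "Enable and start NetworkManager" proceeds: the log records "Evaluated conditional (network_provider == "nm" or network_state != {}): True", with `network_provider` coming from `set_fact` and `network_state` from role defaults. A sketch of that gate with the values the log reports (the variable values are taken from the log; the standalone evaluation is an illustration, not Ansible code):

```python
# The service task's gate as logged: it runs whenever the provider is
# NetworkManager, even though network_state is still the empty default.
network_provider = "nm"  # from set_fact, per the log
network_state = {}       # role default, per the log
run_task = network_provider == "nm" or network_state != {}
print(run_task)  # True -> the service task proceeds to queue and execute
```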
13273 1726853310.16224: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853310.16365: in run() - task 02083763-bbaf-5fc3-657d-000000000087 13273 1726853310.16393: variable 'ansible_search_path' from source: unknown 13273 1726853310.16404: variable 'ansible_search_path' from source: unknown 13273 1726853310.16448: calling self._execute() 13273 1726853310.16556: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853310.16568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853310.16584: variable 'omit' from source: magic vars 13273 1726853310.16976: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.16996: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853310.17176: variable 'network_provider' from source: set_fact 13273 1726853310.17190: variable 'network_state' from source: role '' defaults 13273 1726853310.17206: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13273 1726853310.17268: variable 'omit' from source: magic vars 13273 1726853310.17287: variable 'omit' from source: magic vars 13273 1726853310.17322: variable 'network_service_name' from source: role '' defaults 13273 1726853310.17399: variable 'network_service_name' from source: role '' defaults 13273 1726853310.17518: variable '__network_provider_setup' from source: role '' defaults 13273 1726853310.17531: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853310.17622: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853310.17639: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853310.17921: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853310.18153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13273 1726853310.20633: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853310.20678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853310.20706: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853310.20741: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853310.20761: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853310.20817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.20840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.20858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.20887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.20897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.20927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853310.20948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.20963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.20990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.21001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.21376: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853310.21379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.21382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.21384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.21387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.21390: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.21475: variable 'ansible_python' from source: facts 13273 1726853310.21499: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853310.21577: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853310.21655: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853310.21782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.21812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.21842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.21888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.21909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.21958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.21998: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.22028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.22074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.22095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.22225: variable 'network_connections' from source: task vars 13273 1726853310.22237: variable 'controller_profile' from source: play vars 13273 1726853310.22310: variable 'controller_profile' from source: play vars 13273 1726853310.22413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853310.22588: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853310.22649: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853310.22699: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853310.22743: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853310.22788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853310.22828: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853310.22851: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.22875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853310.22911: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853310.23088: variable 'network_connections' from source: task vars 13273 1726853310.23093: variable 'controller_profile' from source: play vars 13273 1726853310.23149: variable 'controller_profile' from source: play vars 13273 1726853310.23169: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853310.23225: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853310.23415: variable 'network_connections' from source: task vars 13273 1726853310.23419: variable 'controller_profile' from source: play vars 13273 1726853310.23475: variable 'controller_profile' from source: play vars 13273 1726853310.23489: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853310.23539: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853310.23724: variable 'network_connections' from source: task vars 13273 1726853310.23727: variable 'controller_profile' from source: play vars 13273 1726853310.23778: variable 'controller_profile' from source: play vars 13273 1726853310.23816: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853310.23858: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 13273 1726853310.23863: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853310.23908: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853310.24042: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853310.24575: variable 'network_connections' from source: task vars 13273 1726853310.24578: variable 'controller_profile' from source: play vars 13273 1726853310.24584: variable 'controller_profile' from source: play vars 13273 1726853310.24586: variable 'ansible_distribution' from source: facts 13273 1726853310.24588: variable '__network_rh_distros' from source: role '' defaults 13273 1726853310.24589: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.24594: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853310.24766: variable 'ansible_distribution' from source: facts 13273 1726853310.24778: variable '__network_rh_distros' from source: role '' defaults 13273 1726853310.24788: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.24803: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853310.24996: variable 'ansible_distribution' from source: facts 13273 1726853310.25005: variable '__network_rh_distros' from source: role '' defaults 13273 1726853310.25014: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.25054: variable 'network_provider' from source: set_fact 13273 1726853310.25089: variable 'omit' from source: magic vars 13273 1726853310.25118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853310.25151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853310.25173: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853310.25197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853310.25218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853310.25375: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853310.25378: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853310.25386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853310.25388: Set connection var ansible_connection to ssh 13273 1726853310.25390: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853310.25391: Set connection var ansible_shell_executable to /bin/sh 13273 1726853310.25393: Set connection var ansible_shell_type to sh 13273 1726853310.25395: Set connection var ansible_pipelining to False 13273 1726853310.25396: Set connection var ansible_timeout to 10 13273 1726853310.25426: variable 'ansible_shell_executable' from source: unknown 13273 1726853310.25433: variable 'ansible_connection' from source: unknown 13273 1726853310.25439: variable 'ansible_module_compression' from source: unknown 13273 1726853310.25447: variable 'ansible_shell_type' from source: unknown 13273 1726853310.25453: variable 'ansible_shell_executable' from source: unknown 13273 1726853310.25458: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853310.25467: variable 'ansible_pipelining' from source: unknown 13273 1726853310.25476: variable 'ansible_timeout' from source: unknown 13273 1726853310.25484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853310.25564: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853310.25584: variable 'omit' from source: magic vars 13273 1726853310.25589: starting attempt loop 13273 1726853310.25591: running the handler 13273 1726853310.25683: variable 'ansible_facts' from source: unknown 13273 1726853310.26500: _low_level_execute_command(): starting 13273 1726853310.26505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853310.26993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853310.27010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853310.27022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.27069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853310.27085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853310.27158: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13273 1726853310.28880: stdout chunk (state=3): >>>/root <<< 13273 1726853310.29030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853310.29033: stdout chunk (state=3): >>><<< 13273 1726853310.29036: stderr chunk (state=3): >>><<< 13273 1726853310.29051: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853310.29067: _low_level_execute_command(): starting 13273 1726853310.29143: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594 `" && echo ansible-tmp-1726853310.2905712-14612-55308479566594="` echo /root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594 `" ) && sleep 0' 13273 
1726853310.29561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853310.29575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853310.29593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.29637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853310.29655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853310.29716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853310.31656: stdout chunk (state=3): >>>ansible-tmp-1726853310.2905712-14612-55308479566594=/root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594 <<< 13273 1726853310.31762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853310.31789: stderr chunk (state=3): >>><<< 13273 1726853310.31792: stdout chunk (state=3): >>><<< 13273 1726853310.31806: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853310.2905712-14612-55308479566594=/root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853310.31832: variable 'ansible_module_compression' from source: unknown 13273 1726853310.31877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13273 1726853310.31925: variable 'ansible_facts' from source: unknown 13273 1726853310.32059: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/AnsiballZ_systemd.py 13273 1726853310.32162: Sending initial data 13273 1726853310.32166: Sent initial data (155 bytes) 13273 1726853310.32605: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853310.32609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853310.32611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.32618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853310.32621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.32673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853310.32680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853310.32682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853310.32737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853310.34351: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853310.34355: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853310.34402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853310.34464: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmphhz53pn1 /root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/AnsiballZ_systemd.py <<< 13273 1726853310.34467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/AnsiballZ_systemd.py" <<< 13273 1726853310.34516: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmphhz53pn1" to remote "/root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/AnsiballZ_systemd.py" <<< 13273 1726853310.34522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/AnsiballZ_systemd.py" <<< 13273 1726853310.35659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853310.35699: stderr chunk (state=3): >>><<< 13273 1726853310.35702: stdout chunk (state=3): >>><<< 13273 1726853310.35725: done transferring module to remote 13273 1726853310.35733: _low_level_execute_command(): starting 13273 1726853310.35738: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/ /root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/AnsiballZ_systemd.py && sleep 0' 13273 1726853310.36178: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853310.36182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.36184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853310.36186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853310.36188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.36235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853310.36238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853310.36299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853310.38142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853310.38168: stderr chunk (state=3): >>><<< 13273 1726853310.38173: stdout chunk (state=3): >>><<< 13273 1726853310.38184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853310.38187: _low_level_execute_command(): starting 13273 1726853310.38192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/AnsiballZ_systemd.py && sleep 0' 13273 1726853310.38618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853310.38622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.38624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853310.38627: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853310.38628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.38686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853310.38694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853310.38696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853310.38752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853310.68261: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 
13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10481664", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302612992", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "988197000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not 
set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 13273 1726853310.68281: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit<<< 13273 1726853310.68291: stdout chunk (state=3): >>>.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", 
"InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13273 1726853310.70279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853310.70307: stderr chunk (state=3): >>><<< 13273 1726853310.70310: stdout chunk (state=3): >>><<< 13273 1726853310.70330: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10481664", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302612992", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "988197000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853310.70456: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853310.70470: _low_level_execute_command(): starting 13273 1726853310.70475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853310.2905712-14612-55308479566594/ > /dev/null 2>&1 && sleep 0' 13273 1726853310.70927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853310.70932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853310.70934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853310.70936: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853310.70938: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853310.71003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853310.71006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853310.71008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853310.71063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853310.72959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853310.72989: stderr chunk (state=3): >>><<< 13273 1726853310.72992: stdout chunk (state=3): >>><<< 13273 1726853310.73005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853310.73011: handler run complete 13273 1726853310.73056: attempt loop complete, returning result 13273 1726853310.73059: _execute() done 13273 1726853310.73061: dumping result to json 13273 1726853310.73074: done dumping result, returning 13273 1726853310.73084: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-5fc3-657d-000000000087] 13273 1726853310.73087: sending task result for task 02083763-bbaf-5fc3-657d-000000000087 13273 1726853310.73286: done sending task result for task 02083763-bbaf-5fc3-657d-000000000087 13273 1726853310.73289: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853310.73341: no more pending results, returning what we have 13273 1726853310.73347: results queue empty 13273 1726853310.73348: checking for any_errors_fatal 13273 1726853310.73354: done checking for any_errors_fatal 13273 1726853310.73355: checking for max_fail_percentage 13273 1726853310.73356: done checking for max_fail_percentage 13273 1726853310.73357: checking to see if all hosts have failed and the running result is not ok 13273 1726853310.73358: done checking to see if all hosts have failed 13273 1726853310.73358: getting the remaining hosts for this loop 13273 1726853310.73360: done getting the remaining hosts for this loop 13273 1726853310.73363: getting the next task for host managed_node3 13273 1726853310.73368: done getting next task for host managed_node3 13273 1726853310.73374: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853310.73376: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853310.73388: getting variables 13273 1726853310.73389: in VariableManager get_vars() 13273 1726853310.73485: Calling all_inventory to load vars for managed_node3 13273 1726853310.73488: Calling groups_inventory to load vars for managed_node3 13273 1726853310.73490: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853310.73501: Calling all_plugins_play to load vars for managed_node3 13273 1726853310.73504: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853310.73507: Calling groups_plugins_play to load vars for managed_node3 13273 1726853310.74387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853310.75238: done with get_vars() 13273 1726853310.75257: done getting variables 13273 1726853310.75302: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:28:30 -0400 (0:00:00.598) 0:00:28.642 ****** 13273 1726853310.75329: entering _queue_task() for managed_node3/service 13273 1726853310.75652: 
worker is 1 (out of 1 available) 13273 1726853310.75664: exiting _queue_task() for managed_node3/service 13273 1726853310.75878: done queuing things up, now waiting for results queue to drain 13273 1726853310.75880: waiting for pending results... 13273 1726853310.76204: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853310.76216: in run() - task 02083763-bbaf-5fc3-657d-000000000088 13273 1726853310.76220: variable 'ansible_search_path' from source: unknown 13273 1726853310.76223: variable 'ansible_search_path' from source: unknown 13273 1726853310.76231: calling self._execute() 13273 1726853310.76339: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853310.76352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853310.76367: variable 'omit' from source: magic vars 13273 1726853310.76752: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.76768: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853310.76947: variable 'network_provider' from source: set_fact 13273 1726853310.76950: Evaluated conditional (network_provider == "nm"): True 13273 1726853310.77029: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853310.77127: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853310.77304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853310.79555: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853310.79574: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853310.79618: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
13273 1726853310.79662: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853310.79694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853310.80150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.80187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.80223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.80279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.80302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.80476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.80480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.80482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.80485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.80487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.80506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853310.80534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853310.80565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.80613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853310.80633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853310.80779: variable 'network_connections' from source: task vars 13273 1726853310.80797: variable 'controller_profile' from source: play vars 13273 1726853310.80875: variable 'controller_profile' from source: play vars 13273 1726853310.80955: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853310.81133: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853310.81179: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853310.81219: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853310.81246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853310.81291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853310.81315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853310.81332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853310.81354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853310.81392: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853310.81555: variable 'network_connections' from source: task vars 13273 1726853310.81560: variable 'controller_profile' from source: play vars 13273 1726853310.81606: variable 'controller_profile' from source: play vars 13273 1726853310.81627: Evaluated conditional (__network_wpa_supplicant_required): False 13273 1726853310.81632: when evaluation is False, skipping this task 13273 1726853310.81634: _execute() 
done 13273 1726853310.81637: dumping result to json 13273 1726853310.81640: done dumping result, returning 13273 1726853310.81649: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-5fc3-657d-000000000088] 13273 1726853310.81660: sending task result for task 02083763-bbaf-5fc3-657d-000000000088 13273 1726853310.81743: done sending task result for task 02083763-bbaf-5fc3-657d-000000000088 13273 1726853310.81746: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13273 1726853310.81796: no more pending results, returning what we have 13273 1726853310.81799: results queue empty 13273 1726853310.81800: checking for any_errors_fatal 13273 1726853310.81823: done checking for any_errors_fatal 13273 1726853310.81824: checking for max_fail_percentage 13273 1726853310.81826: done checking for max_fail_percentage 13273 1726853310.81826: checking to see if all hosts have failed and the running result is not ok 13273 1726853310.81827: done checking to see if all hosts have failed 13273 1726853310.81827: getting the remaining hosts for this loop 13273 1726853310.81829: done getting the remaining hosts for this loop 13273 1726853310.81832: getting the next task for host managed_node3 13273 1726853310.81838: done getting next task for host managed_node3 13273 1726853310.81842: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853310.81845: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853310.81862: getting variables 13273 1726853310.81864: in VariableManager get_vars() 13273 1726853310.81952: Calling all_inventory to load vars for managed_node3 13273 1726853310.81956: Calling groups_inventory to load vars for managed_node3 13273 1726853310.81958: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853310.81967: Calling all_plugins_play to load vars for managed_node3 13273 1726853310.81969: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853310.81975: Calling groups_plugins_play to load vars for managed_node3 13273 1726853310.83829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853310.87601: done with get_vars() 13273 1726853310.87634: done getting variables 13273 1726853310.87696: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:28:30 -0400 (0:00:00.123) 0:00:28.766 ****** 13273 1726853310.87729: entering _queue_task() for managed_node3/service 13273 1726853310.88919: worker is 1 (out of 1 available) 13273 1726853310.88932: exiting _queue_task() for managed_node3/service 13273 1726853310.88944: done queuing things up, now waiting for results queue to drain 13273 1726853310.88945: waiting for pending results... 
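The wpa_supplicant task above was skipped because `__network_wpa_supplicant_required` evaluated to False. A minimal sketch of how such a conditional skip is expressed (the variable name is taken from the log; the exact task body is an assumption):

```yaml
# Hypothetical task body; only the 'when' variable is confirmed by the log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required
```

When the `when:` expression is falsy, Ansible emits the `skipping:` result with `"false_condition"` naming the failed conditional, exactly as shown above.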
13273 1726853310.89391: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853310.89400: in run() - task 02083763-bbaf-5fc3-657d-000000000089 13273 1726853310.89416: variable 'ansible_search_path' from source: unknown 13273 1726853310.89425: variable 'ansible_search_path' from source: unknown 13273 1726853310.89468: calling self._execute() 13273 1726853310.89567: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853310.89582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853310.89596: variable 'omit' from source: magic vars 13273 1726853310.89960: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.89978: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853310.90093: variable 'network_provider' from source: set_fact 13273 1726853310.90104: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853310.90111: when evaluation is False, skipping this task 13273 1726853310.90176: _execute() done 13273 1726853310.90179: dumping result to json 13273 1726853310.90181: done dumping result, returning 13273 1726853310.90184: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-5fc3-657d-000000000089] 13273 1726853310.90186: sending task result for task 02083763-bbaf-5fc3-657d-000000000089 13273 1726853310.90259: done sending task result for task 02083763-bbaf-5fc3-657d-000000000089 13273 1726853310.90263: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853310.90311: no more pending results, returning what we have 13273 1726853310.90315: results queue empty 13273 1726853310.90316: checking for any_errors_fatal 13273 1726853310.90324: done checking for 
any_errors_fatal 13273 1726853310.90325: checking for max_fail_percentage 13273 1726853310.90327: done checking for max_fail_percentage 13273 1726853310.90328: checking to see if all hosts have failed and the running result is not ok 13273 1726853310.90328: done checking to see if all hosts have failed 13273 1726853310.90329: getting the remaining hosts for this loop 13273 1726853310.90330: done getting the remaining hosts for this loop 13273 1726853310.90333: getting the next task for host managed_node3 13273 1726853310.90339: done getting next task for host managed_node3 13273 1726853310.90344: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853310.90347: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853310.90366: getting variables 13273 1726853310.90367: in VariableManager get_vars() 13273 1726853310.90424: Calling all_inventory to load vars for managed_node3 13273 1726853310.90426: Calling groups_inventory to load vars for managed_node3 13273 1726853310.90429: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853310.90439: Calling all_plugins_play to load vars for managed_node3 13273 1726853310.90441: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853310.90444: Calling groups_plugins_play to load vars for managed_node3 13273 1726853310.91841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853310.93447: done with get_vars() 13273 1726853310.93474: done getting variables 13273 1726853310.93534: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:28:30 -0400 (0:00:00.058) 0:00:28.824 ****** 13273 1726853310.93570: entering _queue_task() for managed_node3/copy 13273 1726853310.93914: worker is 1 (out of 1 available) 13273 1726853310.93926: exiting _queue_task() for managed_node3/copy 13273 1726853310.93938: done queuing things up, now waiting for results queue to drain 13273 1726853310.93939: waiting for pending results... 
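The preceding entries show the same pattern twice: a `when:` clause such as `network_provider == "initscripts"` is templated against task vars, and when it evaluates False the task is skipped without contacting the host ("when evaluation is False, skipping this task"). A minimal illustrative sketch of that gating logic — not Ansible's actual implementation, which runs the expression through Jinja2 — assuming the single-equality form seen in this log:

```python
# Illustrative sketch only: emulate how a `when:` conditional like
# `network_provider == "initscripts"` gates task execution. Real Ansible
# templates the expression with Jinja2; this handles just the one
# equality-comparison shape that appears in the transcript.
def evaluate_when(conditional: str, task_vars: dict) -> bool:
    lhs, _, rhs = conditional.partition("==")
    # Compare the variable's value against the quoted literal.
    return task_vars.get(lhs.strip()) == rhs.strip().strip('"')

# In this run the role set network_provider to "nm" via set_fact,
# so the initscripts-only tasks are skipped.
task_vars = {"network_provider": "nm"}
skipped = not evaluate_when('network_provider == "initscripts"', task_vars)
```

With `network_provider` set to `nm`, the conditional is False and the executor reports `skip_reason: "Conditional result was False"`, exactly as in the JSON result above.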
13273 1726853310.94299: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853310.94373: in run() - task 02083763-bbaf-5fc3-657d-00000000008a 13273 1726853310.94397: variable 'ansible_search_path' from source: unknown 13273 1726853310.94576: variable 'ansible_search_path' from source: unknown 13273 1726853310.94580: calling self._execute() 13273 1726853310.94583: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853310.94585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853310.94588: variable 'omit' from source: magic vars 13273 1726853310.94941: variable 'ansible_distribution_major_version' from source: facts 13273 1726853310.94960: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853310.95086: variable 'network_provider' from source: set_fact 13273 1726853310.95097: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853310.95104: when evaluation is False, skipping this task 13273 1726853310.95111: _execute() done 13273 1726853310.95119: dumping result to json 13273 1726853310.95126: done dumping result, returning 13273 1726853310.95146: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-5fc3-657d-00000000008a] 13273 1726853310.95157: sending task result for task 02083763-bbaf-5fc3-657d-00000000008a 13273 1726853310.95480: done sending task result for task 02083763-bbaf-5fc3-657d-00000000008a 13273 1726853310.95483: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853310.95525: no more pending results, returning what we have 13273 1726853310.95528: results queue empty 13273 1726853310.95529: checking for 
any_errors_fatal 13273 1726853310.95533: done checking for any_errors_fatal 13273 1726853310.95533: checking for max_fail_percentage 13273 1726853310.95535: done checking for max_fail_percentage 13273 1726853310.95535: checking to see if all hosts have failed and the running result is not ok 13273 1726853310.95536: done checking to see if all hosts have failed 13273 1726853310.95537: getting the remaining hosts for this loop 13273 1726853310.95538: done getting the remaining hosts for this loop 13273 1726853310.95541: getting the next task for host managed_node3 13273 1726853310.95549: done getting next task for host managed_node3 13273 1726853310.95553: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853310.95555: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853310.95574: getting variables 13273 1726853310.95576: in VariableManager get_vars() 13273 1726853310.95630: Calling all_inventory to load vars for managed_node3 13273 1726853310.95633: Calling groups_inventory to load vars for managed_node3 13273 1726853310.95636: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853310.95648: Calling all_plugins_play to load vars for managed_node3 13273 1726853310.95651: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853310.95653: Calling groups_plugins_play to load vars for managed_node3 13273 1726853310.98460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853311.00486: done with get_vars() 13273 1726853311.00514: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:28:31 -0400 (0:00:00.070) 0:00:28.895 ****** 13273 1726853311.00607: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853311.00965: worker is 1 (out of 1 available) 13273 1726853311.00986: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853311.01041: done queuing things up, now waiting for results queue to drain 13273 1726853311.01043: waiting for pending results... 
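The `network_connections` task queued here is the first in this excerpt that actually reaches the host: the connection plugin probes the remote home directory, creates a per-task temp directory, uploads the AnsiballZ payload over SFTP, and executes it with the remote Python. The temp-directory names visible later in the log (e.g. `ansible-tmp-1726853311.2391882-14638-112330604490216`) appear to combine a timestamp, a worker PID, and a random suffix; a hedged sketch of that naming scheme, with the field meanings inferred from the transcript rather than taken from Ansible source:

```python
import os
import random
import time

# Sketch of the remote temp-directory naming seen in this log:
#   <base>/ansible-tmp-<epoch.microseconds>-<pid>-<random>
# (assumption: field meanings inferred from paths in the transcript).
# The remote side creates it with `umask 77 && mkdir -p`, so the
# directory is private to the connecting user.
def remote_tmp_name(base: str = "~/.ansible/tmp") -> str:
    return "%s/ansible-tmp-%s-%s-%s" % (
        base, time.time(), os.getpid(), random.randint(0, 2**48))
```

Because each task run gets a fresh name, concurrent plays against the same host cannot collide on the staging path, and the trailing `rm -f -r` seen at the end of the task cleans the directory up again.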
13273 1726853311.01264: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853311.01398: in run() - task 02083763-bbaf-5fc3-657d-00000000008b 13273 1726853311.01422: variable 'ansible_search_path' from source: unknown 13273 1726853311.01431: variable 'ansible_search_path' from source: unknown 13273 1726853311.01477: calling self._execute() 13273 1726853311.01976: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853311.01980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853311.01983: variable 'omit' from source: magic vars 13273 1726853311.02551: variable 'ansible_distribution_major_version' from source: facts 13273 1726853311.02569: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853311.02583: variable 'omit' from source: magic vars 13273 1726853311.02665: variable 'omit' from source: magic vars 13273 1726853311.03023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853311.05057: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853311.05181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853311.05222: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853311.05296: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853311.05359: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853311.05687: variable 'network_provider' from source: set_fact 13273 1726853311.05795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853311.06081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853311.06085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853311.06097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853311.06115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853311.06195: variable 'omit' from source: magic vars 13273 1726853311.06417: variable 'omit' from source: magic vars 13273 1726853311.06524: variable 'network_connections' from source: task vars 13273 1726853311.06590: variable 'controller_profile' from source: play vars 13273 1726853311.06834: variable 'controller_profile' from source: play vars 13273 1726853311.06981: variable 'omit' from source: magic vars 13273 1726853311.07276: variable '__lsr_ansible_managed' from source: task vars 13273 1726853311.07279: variable '__lsr_ansible_managed' from source: task vars 13273 1726853311.07427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13273 1726853311.07863: Loaded config def from plugin (lookup/template) 13273 1726853311.07916: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13273 1726853311.07950: File lookup term: get_ansible_managed.j2 13273 1726853311.07960: 
variable 'ansible_search_path' from source: unknown 13273 1726853311.08003: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13273 1726853311.08021: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13273 1726853311.08042: variable 'ansible_search_path' from source: unknown 13273 1726853311.20193: variable 'ansible_managed' from source: unknown 13273 1726853311.20358: variable 'omit' from source: magic vars 13273 1726853311.20466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853311.20575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853311.20597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853311.20618: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853311.20633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853311.20657: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853311.20666: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853311.20677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853311.20765: Set connection var ansible_connection to ssh 13273 1726853311.20784: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853311.20794: Set connection var ansible_shell_executable to /bin/sh 13273 1726853311.20801: Set connection var ansible_shell_type to sh 13273 1726853311.20811: Set connection var ansible_pipelining to False 13273 1726853311.20820: Set connection var ansible_timeout to 10 13273 1726853311.20852: variable 'ansible_shell_executable' from source: unknown 13273 1726853311.20861: variable 'ansible_connection' from source: unknown 13273 1726853311.20869: variable 'ansible_module_compression' from source: unknown 13273 1726853311.20881: variable 'ansible_shell_type' from source: unknown 13273 1726853311.20890: variable 'ansible_shell_executable' from source: unknown 13273 1726853311.20898: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853311.20907: variable 'ansible_pipelining' from source: unknown 13273 1726853311.20914: variable 'ansible_timeout' from source: unknown 13273 1726853311.20921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853311.21047: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853311.21075: variable 'omit' from source: magic vars 13273 1726853311.21088: starting attempt loop 13273 1726853311.21095: running the handler 13273 1726853311.21110: _low_level_execute_command(): starting 13273 1726853311.21121: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853311.21758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853311.21880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853311.21885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853311.21908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853311.21926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853311.22028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853311.23741: stdout chunk (state=3): >>>/root <<< 13273 1726853311.24006: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853311.24010: stdout chunk (state=3): >>><<< 13273 1726853311.24013: stderr chunk (state=3): >>><<< 13273 1726853311.24015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853311.24018: _low_level_execute_command(): starting 13273 1726853311.24022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216 `" && echo ansible-tmp-1726853311.2391882-14638-112330604490216="` echo /root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216 `" ) && sleep 0' 13273 1726853311.25592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853311.27559: stdout chunk (state=3): >>>ansible-tmp-1726853311.2391882-14638-112330604490216=/root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216 <<< 13273 1726853311.27661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853311.27695: stderr chunk (state=3): >>><<< 13273 1726853311.27704: stdout chunk (state=3): >>><<< 13273 1726853311.27728: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853311.2391882-14638-112330604490216=/root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853311.27776: variable 'ansible_module_compression' from source: unknown 13273 1726853311.27818: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13273 1726853311.27868: variable 'ansible_facts' from source: unknown 13273 1726853311.28014: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/AnsiballZ_network_connections.py 13273 1726853311.28216: Sending initial data 13273 1726853311.28227: Sent initial data (168 bytes) 13273 1726853311.28759: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853311.28775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853311.28798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853311.28873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853311.28894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853311.28910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853311.28931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853311.29020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853311.30646: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853311.30728: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853311.30805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpiix91hyw /root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/AnsiballZ_network_connections.py <<< 13273 1726853311.30835: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/AnsiballZ_network_connections.py" <<< 13273 1726853311.30888: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpiix91hyw" to remote "/root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/AnsiballZ_network_connections.py" <<< 13273 1726853311.31980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853311.32079: stderr chunk (state=3): >>><<< 13273 1726853311.32088: stdout chunk (state=3): >>><<< 13273 1726853311.32116: done transferring module to remote 13273 1726853311.32132: _low_level_execute_command(): starting 13273 1726853311.32141: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/ /root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/AnsiballZ_network_connections.py && sleep 0' 13273 1726853311.32981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853311.33085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853311.33177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853311.35124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853311.35127: stderr chunk (state=3): >>><<< 13273 1726853311.35147: stdout chunk (state=3): >>><<< 13273 1726853311.35194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853311.35199: _low_level_execute_command(): starting 13273 1726853311.35279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/AnsiballZ_network_connections.py && sleep 0' 13273 1726853311.35863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853311.35880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853311.35893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853311.35924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853311.36033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853311.36057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853311.36166: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13273 1726853311.83624: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_2i5w_0u3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_2i5w_0u3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a541bcb3-3a19-4ffe-82d9-f7e984395e25: error=unknown<<< 13273 1726853311.83673: stdout chunk (state=3): >>> <<< 13273 1726853311.83861: stdout chunk (state=3): >>> <<< 13273 1726853311.83995: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13273 1726853311.85935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853311.85962: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853311.86039: stderr chunk (state=3): >>><<< 13273 1726853311.86046: stdout chunk (state=3): >>><<< 13273 1726853311.86189: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_2i5w_0u3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_2i5w_0u3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a541bcb3-3a19-4ffe-82d9-f7e984395e25: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853311.86193: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853311.86196: _low_level_execute_command(): starting 13273 1726853311.86199: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853311.2391882-14638-112330604490216/ > 
/dev/null 2>&1 && sleep 0' 13273 1726853311.86926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853311.86938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853311.86958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853311.87094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853311.87197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853311.87225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853311.87428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853311.89369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853311.89375: stdout chunk (state=3): >>><<< 13273 1726853311.89378: stderr chunk (state=3): >>><<< 13273 1726853311.89394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853311.89405: handler run complete 13273 1726853311.89576: attempt loop complete, returning result 13273 1726853311.89579: _execute() done 13273 1726853311.89586: dumping result to json 13273 1726853311.89588: done dumping result, returning 13273 1726853311.89591: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-5fc3-657d-00000000008b] 13273 1726853311.89593: sending task result for task 02083763-bbaf-5fc3-657d-00000000008b 13273 1726853311.89664: done sending task result for task 02083763-bbaf-5fc3-657d-00000000008b 13273 1726853311.89667: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, 
"changed": true } STDERR: 13273 1726853311.89779: no more pending results, returning what we have 13273 1726853311.89783: results queue empty 13273 1726853311.89784: checking for any_errors_fatal 13273 1726853311.89789: done checking for any_errors_fatal 13273 1726853311.89790: checking for max_fail_percentage 13273 1726853311.89792: done checking for max_fail_percentage 13273 1726853311.89792: checking to see if all hosts have failed and the running result is not ok 13273 1726853311.89793: done checking to see if all hosts have failed 13273 1726853311.89794: getting the remaining hosts for this loop 13273 1726853311.89795: done getting the remaining hosts for this loop 13273 1726853311.89798: getting the next task for host managed_node3 13273 1726853311.89804: done getting next task for host managed_node3 13273 1726853311.89808: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853311.89811: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853311.89822: getting variables 13273 1726853311.89824: in VariableManager get_vars() 13273 1726853311.90084: Calling all_inventory to load vars for managed_node3 13273 1726853311.90087: Calling groups_inventory to load vars for managed_node3 13273 1726853311.90090: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853311.90099: Calling all_plugins_play to load vars for managed_node3 13273 1726853311.90102: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853311.90105: Calling groups_plugins_play to load vars for managed_node3 13273 1726853311.92681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853311.94633: done with get_vars() 13273 1726853311.94657: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:28:31 -0400 (0:00:00.941) 0:00:29.836 ****** 13273 1726853311.94748: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853311.95215: worker is 1 (out of 1 available) 13273 1726853311.95380: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853311.95395: done queuing things up, now waiting for results queue to drain 13273 1726853311.95396: waiting for pending results... 
13273 1726853311.95993: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853311.96001: in run() - task 02083763-bbaf-5fc3-657d-00000000008c 13273 1726853311.96005: variable 'ansible_search_path' from source: unknown 13273 1726853311.96179: variable 'ansible_search_path' from source: unknown 13273 1726853311.96215: calling self._execute() 13273 1726853311.96311: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853311.96318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853311.96326: variable 'omit' from source: magic vars 13273 1726853311.97055: variable 'ansible_distribution_major_version' from source: facts 13273 1726853311.97074: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853311.97226: variable 'network_state' from source: role '' defaults 13273 1726853311.97241: Evaluated conditional (network_state != {}): False 13273 1726853311.97253: when evaluation is False, skipping this task 13273 1726853311.97261: _execute() done 13273 1726853311.97268: dumping result to json 13273 1726853311.97278: done dumping result, returning 13273 1726853311.97290: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-5fc3-657d-00000000008c] 13273 1726853311.97305: sending task result for task 02083763-bbaf-5fc3-657d-00000000008c 13273 1726853311.97577: done sending task result for task 02083763-bbaf-5fc3-657d-00000000008c 13273 1726853311.97581: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853311.97647: no more pending results, returning what we have 13273 1726853311.97651: results queue empty 13273 1726853311.97652: checking for any_errors_fatal 13273 1726853311.97665: done checking for any_errors_fatal 
13273 1726853311.97666: checking for max_fail_percentage 13273 1726853311.97669: done checking for max_fail_percentage 13273 1726853311.97669: checking to see if all hosts have failed and the running result is not ok 13273 1726853311.97670: done checking to see if all hosts have failed 13273 1726853311.97673: getting the remaining hosts for this loop 13273 1726853311.97674: done getting the remaining hosts for this loop 13273 1726853311.97678: getting the next task for host managed_node3 13273 1726853311.97685: done getting next task for host managed_node3 13273 1726853311.97689: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853311.97698: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853311.97720: getting variables 13273 1726853311.97722: in VariableManager get_vars() 13273 1726853311.97886: Calling all_inventory to load vars for managed_node3 13273 1726853311.97889: Calling groups_inventory to load vars for managed_node3 13273 1726853311.97892: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853311.97900: Calling all_plugins_play to load vars for managed_node3 13273 1726853311.97903: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853311.97906: Calling groups_plugins_play to load vars for managed_node3 13273 1726853311.99341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853312.01108: done with get_vars() 13273 1726853312.01131: done getting variables 13273 1726853312.01192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:28:32 -0400 (0:00:00.064) 0:00:29.901 ****** 13273 1726853312.01225: entering _queue_task() for managed_node3/debug 13273 1726853312.02000: worker is 1 (out of 1 available) 13273 1726853312.02015: exiting _queue_task() for managed_node3/debug 13273 1726853312.02086: done queuing things up, now waiting for results queue to drain 13273 1726853312.02087: waiting for pending results... 
13273 1726853312.02381: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853312.02406: in run() - task 02083763-bbaf-5fc3-657d-00000000008d 13273 1726853312.02476: variable 'ansible_search_path' from source: unknown 13273 1726853312.02480: variable 'ansible_search_path' from source: unknown 13273 1726853312.02484: calling self._execute() 13273 1726853312.02540: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.02547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.02556: variable 'omit' from source: magic vars 13273 1726853312.02901: variable 'ansible_distribution_major_version' from source: facts 13273 1726853312.02908: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853312.03015: variable 'omit' from source: magic vars 13273 1726853312.03019: variable 'omit' from source: magic vars 13273 1726853312.03022: variable 'omit' from source: magic vars 13273 1726853312.03040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853312.03076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853312.03094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853312.03112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853312.03123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853312.03153: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853312.03157: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.03159: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13273 1726853312.03249: Set connection var ansible_connection to ssh 13273 1726853312.03259: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853312.03264: Set connection var ansible_shell_executable to /bin/sh 13273 1726853312.03267: Set connection var ansible_shell_type to sh 13273 1726853312.03274: Set connection var ansible_pipelining to False 13273 1726853312.03280: Set connection var ansible_timeout to 10 13273 1726853312.03303: variable 'ansible_shell_executable' from source: unknown 13273 1726853312.03306: variable 'ansible_connection' from source: unknown 13273 1726853312.03309: variable 'ansible_module_compression' from source: unknown 13273 1726853312.03311: variable 'ansible_shell_type' from source: unknown 13273 1726853312.03314: variable 'ansible_shell_executable' from source: unknown 13273 1726853312.03316: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.03318: variable 'ansible_pipelining' from source: unknown 13273 1726853312.03320: variable 'ansible_timeout' from source: unknown 13273 1726853312.03340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.03558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853312.03564: variable 'omit' from source: magic vars 13273 1726853312.03567: starting attempt loop 13273 1726853312.03569: running the handler 13273 1726853312.03667: variable '__network_connections_result' from source: set_fact 13273 1726853312.03673: handler run complete 13273 1726853312.03676: attempt loop complete, returning result 13273 1726853312.03679: _execute() done 13273 1726853312.03681: dumping result to json 13273 1726853312.03683: 
done dumping result, returning 13273 1726853312.03685: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-5fc3-657d-00000000008d] 13273 1726853312.03687: sending task result for task 02083763-bbaf-5fc3-657d-00000000008d 13273 1726853312.03764: done sending task result for task 02083763-bbaf-5fc3-657d-00000000008d 13273 1726853312.03767: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 13273 1726853312.03859: no more pending results, returning what we have 13273 1726853312.03862: results queue empty 13273 1726853312.03863: checking for any_errors_fatal 13273 1726853312.03867: done checking for any_errors_fatal 13273 1726853312.03868: checking for max_fail_percentage 13273 1726853312.03870: done checking for max_fail_percentage 13273 1726853312.03872: checking to see if all hosts have failed and the running result is not ok 13273 1726853312.03873: done checking to see if all hosts have failed 13273 1726853312.03874: getting the remaining hosts for this loop 13273 1726853312.03875: done getting the remaining hosts for this loop 13273 1726853312.03878: getting the next task for host managed_node3 13273 1726853312.03888: done getting next task for host managed_node3 13273 1726853312.03891: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13273 1726853312.03893: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13273 1726853312.03903: getting variables 13273 1726853312.03904: in VariableManager get_vars() 13273 1726853312.03946: Calling all_inventory to load vars for managed_node3 13273 1726853312.03949: Calling groups_inventory to load vars for managed_node3 13273 1726853312.03951: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853312.03958: Calling all_plugins_play to load vars for managed_node3 13273 1726853312.03961: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853312.03963: Calling groups_plugins_play to load vars for managed_node3 13273 1726853312.05526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853312.12282: done with get_vars() 13273 1726853312.12306: done getting variables 13273 1726853312.12357: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:28:32 -0400 (0:00:00.111) 0:00:30.013 ****** 13273 1726853312.12386: entering _queue_task() for managed_node3/debug 13273 1726853312.12730: worker is 1 (out of 1 available) 13273 1726853312.12742: exiting _queue_task() for managed_node3/debug 13273 1726853312.12756: done queuing things up, now waiting for results queue to drain 13273 1726853312.12757: waiting for pending results... 
13273 1726853312.13148: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13273 1726853312.13236: in run() - task 02083763-bbaf-5fc3-657d-00000000008e 13273 1726853312.13261: variable 'ansible_search_path' from source: unknown 13273 1726853312.13353: variable 'ansible_search_path' from source: unknown 13273 1726853312.13358: calling self._execute() 13273 1726853312.13418: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.13432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.13450: variable 'omit' from source: magic vars 13273 1726853312.13835: variable 'ansible_distribution_major_version' from source: facts 13273 1726853312.13858: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853312.13873: variable 'omit' from source: magic vars 13273 1726853312.13940: variable 'omit' from source: magic vars 13273 1726853312.13989: variable 'omit' from source: magic vars 13273 1726853312.14045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853312.14094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853312.14177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853312.14181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853312.14184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853312.14209: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853312.14226: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.14235: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13273 1726853312.14348: Set connection var ansible_connection to ssh 13273 1726853312.14365: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853312.14378: Set connection var ansible_shell_executable to /bin/sh 13273 1726853312.14441: Set connection var ansible_shell_type to sh 13273 1726853312.14447: Set connection var ansible_pipelining to False 13273 1726853312.14450: Set connection var ansible_timeout to 10 13273 1726853312.14452: variable 'ansible_shell_executable' from source: unknown 13273 1726853312.14454: variable 'ansible_connection' from source: unknown 13273 1726853312.14457: variable 'ansible_module_compression' from source: unknown 13273 1726853312.14459: variable 'ansible_shell_type' from source: unknown 13273 1726853312.14465: variable 'ansible_shell_executable' from source: unknown 13273 1726853312.14476: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.14485: variable 'ansible_pipelining' from source: unknown 13273 1726853312.14491: variable 'ansible_timeout' from source: unknown 13273 1726853312.14499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.14663: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853312.14769: variable 'omit' from source: magic vars 13273 1726853312.14774: starting attempt loop 13273 1726853312.14778: running the handler 13273 1726853312.14780: variable '__network_connections_result' from source: set_fact 13273 1726853312.14840: variable '__network_connections_result' from source: set_fact 13273 1726853312.14968: handler run complete 13273 1726853312.15007: attempt loop complete, returning result 13273 1726853312.15015: 
_execute() done 13273 1726853312.15024: dumping result to json 13273 1726853312.15034: done dumping result, returning 13273 1726853312.15050: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-5fc3-657d-00000000008e] 13273 1726853312.15060: sending task result for task 02083763-bbaf-5fc3-657d-00000000008e 13273 1726853312.15358: done sending task result for task 02083763-bbaf-5fc3-657d-00000000008e 13273 1726853312.15361: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13273 1726853312.15453: no more pending results, returning what we have 13273 1726853312.15457: results queue empty 13273 1726853312.15458: checking for any_errors_fatal 13273 1726853312.15467: done checking for any_errors_fatal 13273 1726853312.15468: checking for max_fail_percentage 13273 1726853312.15470: done checking for max_fail_percentage 13273 1726853312.15473: checking to see if all hosts have failed and the running result is not ok 13273 1726853312.15473: done checking to see if all hosts have failed 13273 1726853312.15474: getting the remaining hosts for this loop 13273 1726853312.15476: done getting the remaining hosts for this loop 13273 1726853312.15480: getting the next task for host managed_node3 13273 1726853312.15486: done getting next task for host managed_node3 13273 1726853312.15490: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13273 1726853312.15494: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853312.15506: getting variables 13273 1726853312.15508: in VariableManager get_vars() 13273 1726853312.15560: Calling all_inventory to load vars for managed_node3 13273 1726853312.15564: Calling groups_inventory to load vars for managed_node3 13273 1726853312.15567: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853312.15691: Calling all_plugins_play to load vars for managed_node3 13273 1726853312.15695: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853312.15699: Calling groups_plugins_play to load vars for managed_node3 13273 1726853312.17092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853312.18713: done with get_vars() 13273 1726853312.18739: done getting variables 13273 1726853312.18807: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:28:32 -0400 (0:00:00.064) 0:00:30.077 ****** 13273 1726853312.18851: entering _queue_task() for managed_node3/debug 13273 1726853312.19322: 
worker is 1 (out of 1 available) 13273 1726853312.19333: exiting _queue_task() for managed_node3/debug 13273 1726853312.19347: done queuing things up, now waiting for results queue to drain 13273 1726853312.19349: waiting for pending results... 13273 1726853312.19549: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13273 1726853312.19715: in run() - task 02083763-bbaf-5fc3-657d-00000000008f 13273 1726853312.19736: variable 'ansible_search_path' from source: unknown 13273 1726853312.19751: variable 'ansible_search_path' from source: unknown 13273 1726853312.19858: calling self._execute() 13273 1726853312.19911: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.19928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.19946: variable 'omit' from source: magic vars 13273 1726853312.20368: variable 'ansible_distribution_major_version' from source: facts 13273 1726853312.20386: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853312.20497: variable 'network_state' from source: role '' defaults 13273 1726853312.20516: Evaluated conditional (network_state != {}): False 13273 1726853312.20522: when evaluation is False, skipping this task 13273 1726853312.20528: _execute() done 13273 1726853312.20534: dumping result to json 13273 1726853312.20539: done dumping result, returning 13273 1726853312.20620: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-5fc3-657d-00000000008f] 13273 1726853312.20624: sending task result for task 02083763-bbaf-5fc3-657d-00000000008f 13273 1726853312.20702: done sending task result for task 02083763-bbaf-5fc3-657d-00000000008f 13273 1726853312.20705: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 13273 1726853312.20805: 
no more pending results, returning what we have 13273 1726853312.20809: results queue empty 13273 1726853312.20810: checking for any_errors_fatal 13273 1726853312.20819: done checking for any_errors_fatal 13273 1726853312.20819: checking for max_fail_percentage 13273 1726853312.20821: done checking for max_fail_percentage 13273 1726853312.20822: checking to see if all hosts have failed and the running result is not ok 13273 1726853312.20822: done checking to see if all hosts have failed 13273 1726853312.20823: getting the remaining hosts for this loop 13273 1726853312.20824: done getting the remaining hosts for this loop 13273 1726853312.20827: getting the next task for host managed_node3 13273 1726853312.20951: done getting next task for host managed_node3 13273 1726853312.20956: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13273 1726853312.20959: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853312.20978: getting variables 13273 1726853312.20980: in VariableManager get_vars() 13273 1726853312.21030: Calling all_inventory to load vars for managed_node3 13273 1726853312.21033: Calling groups_inventory to load vars for managed_node3 13273 1726853312.21036: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853312.21048: Calling all_plugins_play to load vars for managed_node3 13273 1726853312.21050: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853312.21169: Calling groups_plugins_play to load vars for managed_node3 13273 1726853312.22669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853312.24287: done with get_vars() 13273 1726853312.24315: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:28:32 -0400 (0:00:00.055) 0:00:30.133 ****** 13273 1726853312.24428: entering _queue_task() for managed_node3/ping 13273 1726853312.24809: worker is 1 (out of 1 available) 13273 1726853312.24823: exiting _queue_task() for managed_node3/ping 13273 1726853312.24834: done queuing things up, now waiting for results queue to drain 13273 1726853312.24835: waiting for pending results... 
13273 1726853312.25193: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13273 1726853312.25377: in run() - task 02083763-bbaf-5fc3-657d-000000000090 13273 1726853312.25381: variable 'ansible_search_path' from source: unknown 13273 1726853312.25384: variable 'ansible_search_path' from source: unknown 13273 1726853312.25389: calling self._execute() 13273 1726853312.25467: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.25507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.25511: variable 'omit' from source: magic vars 13273 1726853312.25912: variable 'ansible_distribution_major_version' from source: facts 13273 1726853312.25930: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853312.26054: variable 'omit' from source: magic vars 13273 1726853312.26058: variable 'omit' from source: magic vars 13273 1726853312.26060: variable 'omit' from source: magic vars 13273 1726853312.26097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853312.26138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853312.26175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853312.26198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853312.26214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853312.26252: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853312.26262: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.26378: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13273 1726853312.26381: Set connection var ansible_connection to ssh 13273 1726853312.26400: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853312.26411: Set connection var ansible_shell_executable to /bin/sh 13273 1726853312.26418: Set connection var ansible_shell_type to sh 13273 1726853312.26429: Set connection var ansible_pipelining to False 13273 1726853312.26440: Set connection var ansible_timeout to 10 13273 1726853312.26474: variable 'ansible_shell_executable' from source: unknown 13273 1726853312.26488: variable 'ansible_connection' from source: unknown 13273 1726853312.26503: variable 'ansible_module_compression' from source: unknown 13273 1726853312.26511: variable 'ansible_shell_type' from source: unknown 13273 1726853312.26518: variable 'ansible_shell_executable' from source: unknown 13273 1726853312.26526: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.26534: variable 'ansible_pipelining' from source: unknown 13273 1726853312.26542: variable 'ansible_timeout' from source: unknown 13273 1726853312.26554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.26813: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853312.26819: variable 'omit' from source: magic vars 13273 1726853312.26822: starting attempt loop 13273 1726853312.26823: running the handler 13273 1726853312.26825: _low_level_execute_command(): starting 13273 1726853312.26827: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853312.28005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.28116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.28221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.29922: stdout chunk (state=3): >>>/root <<< 13273 1726853312.30281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.30284: stdout chunk (state=3): >>><<< 13273 1726853312.30286: stderr chunk (state=3): >>><<< 13273 1726853312.30291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853312.30293: _low_level_execute_command(): starting 13273 1726853312.30296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449 `" && echo ansible-tmp-1726853312.302101-14686-258895323883449="` echo /root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449 `" ) && sleep 0' 13273 1726853312.31582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.31793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.31904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.33911: stdout chunk (state=3): >>>ansible-tmp-1726853312.302101-14686-258895323883449=/root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449 <<< 13273 1726853312.34090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.34112: stderr chunk (state=3): >>><<< 13273 1726853312.34123: stdout chunk (state=3): >>><<< 13273 1726853312.34191: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853312.302101-14686-258895323883449=/root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853312.34500: variable 'ansible_module_compression' from source: unknown 13273 1726853312.34504: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13273 1726853312.34506: variable 'ansible_facts' from source: unknown 13273 1726853312.34585: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/AnsiballZ_ping.py 13273 1726853312.34833: Sending initial data 13273 1726853312.34843: Sent initial data (152 bytes) 13273 1726853312.35428: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853312.35488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853312.35552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853312.35573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.35596: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13273 1726853312.35692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.37391: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853312.37457: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp3kkwuo7d /root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/AnsiballZ_ping.py <<< 13273 1726853312.37460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/AnsiballZ_ping.py" <<< 13273 1726853312.37507: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp3kkwuo7d" to remote "/root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/AnsiballZ_ping.py" <<< 13273 1726853312.38820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.38823: stdout chunk (state=3): >>><<< 13273 1726853312.38825: stderr chunk (state=3): >>><<< 13273 1726853312.38979: done transferring module to remote 13273 1726853312.38983: _low_level_execute_command(): starting 13273 1726853312.38985: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/ /root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/AnsiballZ_ping.py && sleep 0' 13273 1726853312.40115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853312.40253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853312.40315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853312.40374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.40413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.40479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.42550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.42555: stdout chunk (state=3): >>><<< 13273 1726853312.42558: stderr chunk (state=3): >>><<< 13273 1726853312.42664: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853312.42676: _low_level_execute_command(): starting 13273 1726853312.42678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/AnsiballZ_ping.py && sleep 0' 13273 1726853312.43393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853312.43434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853312.43446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853312.43457: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853312.43485: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853312.43559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853312.43602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
13273 1726853312.43605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.43859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.59116: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13273 1726853312.60522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853312.60534: stdout chunk (state=3): >>><<< 13273 1726853312.60677: stderr chunk (state=3): >>><<< 13273 1726853312.60682: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853312.60686: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853312.60688: _low_level_execute_command(): starting 13273 1726853312.60690: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853312.302101-14686-258895323883449/ > /dev/null 2>&1 && sleep 0' 13273 1726853312.61853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853312.62069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
13273 1726853312.62138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.62158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.62360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.64291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.64316: stderr chunk (state=3): >>><<< 13273 1726853312.64393: stdout chunk (state=3): >>><<< 13273 1726853312.64408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853312.64418: handler run complete 13273 1726853312.64434: attempt loop complete, returning result 13273 1726853312.64440: _execute() done 13273 
1726853312.64446: dumping result to json 13273 1726853312.64451: done dumping result, returning 13273 1726853312.64604: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-5fc3-657d-000000000090] 13273 1726853312.64607: sending task result for task 02083763-bbaf-5fc3-657d-000000000090 13273 1726853312.64684: done sending task result for task 02083763-bbaf-5fc3-657d-000000000090 13273 1726853312.64688: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 13273 1726853312.64779: no more pending results, returning what we have 13273 1726853312.64783: results queue empty 13273 1726853312.64784: checking for any_errors_fatal 13273 1726853312.64792: done checking for any_errors_fatal 13273 1726853312.64793: checking for max_fail_percentage 13273 1726853312.64795: done checking for max_fail_percentage 13273 1726853312.64796: checking to see if all hosts have failed and the running result is not ok 13273 1726853312.64796: done checking to see if all hosts have failed 13273 1726853312.64797: getting the remaining hosts for this loop 13273 1726853312.64799: done getting the remaining hosts for this loop 13273 1726853312.64802: getting the next task for host managed_node3 13273 1726853312.64812: done getting next task for host managed_node3 13273 1726853312.64815: ^ task is: TASK: meta (role_complete) 13273 1726853312.64818: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13273 1726853312.64831: getting variables 13273 1726853312.64833: in VariableManager get_vars() 13273 1726853312.65198: Calling all_inventory to load vars for managed_node3 13273 1726853312.65201: Calling groups_inventory to load vars for managed_node3 13273 1726853312.65204: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853312.65215: Calling all_plugins_play to load vars for managed_node3 13273 1726853312.65218: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853312.65221: Calling groups_plugins_play to load vars for managed_node3 13273 1726853312.68005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853312.71173: done with get_vars() 13273 1726853312.71205: done getting variables 13273 1726853312.71497: done queuing things up, now waiting for results queue to drain 13273 1726853312.71499: results queue empty 13273 1726853312.71500: checking for any_errors_fatal 13273 1726853312.71503: done checking for any_errors_fatal 13273 1726853312.71504: checking for max_fail_percentage 13273 1726853312.71505: done checking for max_fail_percentage 13273 1726853312.71506: checking to see if all hosts have failed and the running result is not ok 13273 1726853312.71506: done checking to see if all hosts have failed 13273 1726853312.71507: getting the remaining hosts for this loop 13273 1726853312.71508: done getting the remaining hosts for this loop 13273 1726853312.71511: getting the next task for host managed_node3 13273 1726853312.71515: done getting next task for host managed_node3 13273 1726853312.71518: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 13273 1726853312.71519: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853312.71522: getting variables 13273 1726853312.71523: in VariableManager get_vars() 13273 1726853312.71544: Calling all_inventory to load vars for managed_node3 13273 1726853312.71547: Calling groups_inventory to load vars for managed_node3 13273 1726853312.71549: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853312.71554: Calling all_plugins_play to load vars for managed_node3 13273 1726853312.71556: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853312.71559: Calling groups_plugins_play to load vars for managed_node3 13273 1726853312.74001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853312.77160: done with get_vars() 13273 1726853312.77189: done getting variables 13273 1726853312.77233: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853312.77562: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Friday 20 September 2024 13:28:32 -0400 (0:00:00.531) 0:00:30.665 ****** 13273 1726853312.77600: entering _queue_task() for managed_node3/command 13273 1726853312.78101: worker is 1 (out of 1 available) 13273 1726853312.78114: exiting _queue_task() for managed_node3/command 13273 1726853312.78128: done queuing things up, now waiting for results queue to drain 13273 1726853312.78134: waiting for pending results... 
13273 1726853312.78487: running TaskExecutor() for managed_node3/TASK: From the active connection, get the port1 profile "bond0.0" 13273 1726853312.78492: in run() - task 02083763-bbaf-5fc3-657d-0000000000c0 13273 1726853312.78495: variable 'ansible_search_path' from source: unknown 13273 1726853312.78519: calling self._execute() 13273 1726853312.78622: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.78635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.78649: variable 'omit' from source: magic vars 13273 1726853312.79025: variable 'ansible_distribution_major_version' from source: facts 13273 1726853312.79043: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853312.79145: variable 'network_provider' from source: set_fact 13273 1726853312.79155: Evaluated conditional (network_provider == "nm"): True 13273 1726853312.79165: variable 'omit' from source: magic vars 13273 1726853312.79195: variable 'omit' from source: magic vars 13273 1726853312.79393: variable 'port1_profile' from source: play vars 13273 1726853312.79445: variable 'omit' from source: magic vars 13273 1726853312.79562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853312.79685: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853312.79824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853312.79828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853312.79841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853312.79891: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853312.79902: 
variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.79976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.80018: Set connection var ansible_connection to ssh 13273 1726853312.80033: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853312.80045: Set connection var ansible_shell_executable to /bin/sh 13273 1726853312.80054: Set connection var ansible_shell_type to sh 13273 1726853312.80064: Set connection var ansible_pipelining to False 13273 1726853312.80076: Set connection var ansible_timeout to 10 13273 1726853312.80125: variable 'ansible_shell_executable' from source: unknown 13273 1726853312.80134: variable 'ansible_connection' from source: unknown 13273 1726853312.80141: variable 'ansible_module_compression' from source: unknown 13273 1726853312.80150: variable 'ansible_shell_type' from source: unknown 13273 1726853312.80158: variable 'ansible_shell_executable' from source: unknown 13273 1726853312.80165: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853312.80176: variable 'ansible_pipelining' from source: unknown 13273 1726853312.80184: variable 'ansible_timeout' from source: unknown 13273 1726853312.80217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853312.80354: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853312.80373: variable 'omit' from source: magic vars 13273 1726853312.80385: starting attempt loop 13273 1726853312.80435: running the handler 13273 1726853312.80438: _low_level_execute_command(): starting 13273 1726853312.80441: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 
1726853312.81487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853312.81595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853312.81862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853312.81881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.81902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.81999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.83734: stdout chunk (state=3): >>>/root <<< 13273 1726853312.83878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.83883: stdout chunk (state=3): >>><<< 13273 1726853312.84036: stderr chunk (state=3): >>><<< 13273 1726853312.84042: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853312.84044: _low_level_execute_command(): starting 13273 1726853312.84047: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492 `" && echo ansible-tmp-1726853312.839598-14709-255366987224492="` echo /root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492 `" ) && sleep 0' 13273 1726853312.85046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.85087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.85191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.87425: stdout chunk (state=3): >>>ansible-tmp-1726853312.839598-14709-255366987224492=/root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492 <<< 13273 1726853312.87439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.87449: stdout chunk (state=3): >>><<< 13273 1726853312.87460: stderr chunk (state=3): >>><<< 13273 1726853312.87493: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853312.839598-14709-255366987224492=/root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853312.87533: variable 'ansible_module_compression' from source: unknown 13273 1726853312.87676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853312.87679: variable 'ansible_facts' from source: unknown 13273 1726853312.87749: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/AnsiballZ_command.py 13273 1726853312.87929: Sending initial data 13273 1726853312.87938: Sent initial data (155 bytes) 13273 1726853312.89142: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853312.89216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853312.89265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.89378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.91056: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853312.91153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853312.91278: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpfygweovr /root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/AnsiballZ_command.py <<< 13273 1726853312.91293: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/AnsiballZ_command.py" <<< 13273 1726853312.91511: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpfygweovr" to remote "/root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/AnsiballZ_command.py" <<< 13273 1726853312.92584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.92665: stderr chunk (state=3): >>><<< 13273 1726853312.92682: stdout chunk (state=3): >>><<< 13273 1726853312.92749: done transferring module to remote 13273 1726853312.92753: _low_level_execute_command(): starting 13273 1726853312.92755: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/ /root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/AnsiballZ_command.py && sleep 0' 13273 1726853312.93362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853312.93406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853312.93418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853312.93514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853312.93526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.93547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.93636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853312.95688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853312.95692: stdout chunk (state=3): >>><<< 13273 1726853312.95695: stderr chunk (state=3): >>><<< 13273 1726853312.95778: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853312.95792: _low_level_execute_command(): starting 13273 1726853312.95798: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/AnsiballZ_command.py && sleep 0' 13273 1726853312.96383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853312.96396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853312.96410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853312.96427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853312.96556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853312.96561: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 
1726853312.96577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853312.96593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853312.96694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853313.14314: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-20 13:28:33.119969", "end": "2024-09-20 13:28:33.141778", "delta": "0:00:00.021809", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853313.16179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853313.16183: stdout chunk (state=3): >>><<< 13273 1726853313.16185: stderr chunk (state=3): >>><<< 13273 1726853313.16188: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-20 13:28:33.119969", "end": "2024-09-20 13:28:33.141778", "delta": "0:00:00.021809", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853313.16191: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853313.16193: _low_level_execute_command(): starting 13273 1726853313.16195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853312.839598-14709-255366987224492/ > /dev/null 2>&1 && sleep 0' 13273 
1726853313.16720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853313.16735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853313.16757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853313.16801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853313.16812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853313.16816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853313.16890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853313.18814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853313.18818: stdout chunk (state=3): >>><<< 13273 1726853313.18820: stderr chunk (state=3): >>><<< 13273 1726853313.18977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853313.18985: handler run complete 13273 1726853313.18988: Evaluated conditional (False): False 13273 1726853313.18990: attempt loop complete, returning result 13273 1726853313.18993: _execute() done 13273 1726853313.18995: dumping result to json 13273 1726853313.18997: done dumping result, returning 13273 1726853313.18999: done running TaskExecutor() for managed_node3/TASK: From the active connection, get the port1 profile "bond0.0" [02083763-bbaf-5fc3-657d-0000000000c0] 13273 1726853313.19001: sending task result for task 02083763-bbaf-5fc3-657d-0000000000c0 ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.0" ], "delta": "0:00:00.021809", "end": "2024-09-20 13:28:33.141778", "rc": 0, "start": "2024-09-20 13:28:33.119969" } 13273 1726853313.19180: no more pending results, returning what we have 13273 1726853313.19191: results queue empty 13273 1726853313.19192: checking for any_errors_fatal 13273 1726853313.19196: done checking for 
any_errors_fatal 13273 1726853313.19212: checking for max_fail_percentage 13273 1726853313.19215: done checking for max_fail_percentage 13273 1726853313.19215: checking to see if all hosts have failed and the running result is not ok 13273 1726853313.19216: done checking to see if all hosts have failed 13273 1726853313.19217: getting the remaining hosts for this loop 13273 1726853313.19218: done getting the remaining hosts for this loop 13273 1726853313.19221: getting the next task for host managed_node3 13273 1726853313.19227: done getting next task for host managed_node3 13273 1726853313.19229: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 13273 1726853313.19231: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853313.19235: getting variables 13273 1726853313.19236: in VariableManager get_vars() 13273 1726853313.19297: Calling all_inventory to load vars for managed_node3 13273 1726853313.19300: Calling groups_inventory to load vars for managed_node3 13273 1726853313.19303: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853313.19309: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000c0 13273 1726853313.19311: WORKER PROCESS EXITING 13273 1726853313.19320: Calling all_plugins_play to load vars for managed_node3 13273 1726853313.19322: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853313.19325: Calling groups_plugins_play to load vars for managed_node3 13273 1726853313.20123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853313.21119: done with get_vars() 13273 1726853313.21135: done getting variables 13273 1726853313.21183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853313.21270: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Friday 20 September 2024 13:28:33 -0400 (0:00:00.436) 0:00:31.102 ****** 13273 1726853313.21293: entering _queue_task() for managed_node3/command 13273 1726853313.21542: worker is 1 (out of 1 available) 13273 1726853313.21559: exiting _queue_task() for managed_node3/command 13273 1726853313.21572: done queuing things up, now waiting for results queue to drain 13273 1726853313.21574: waiting for pending results... 
13273 1726853313.21749: running TaskExecutor() for managed_node3/TASK: From the active connection, get the port2 profile "bond0.1" 13273 1726853313.21817: in run() - task 02083763-bbaf-5fc3-657d-0000000000c1 13273 1726853313.21830: variable 'ansible_search_path' from source: unknown 13273 1726853313.21860: calling self._execute() 13273 1726853313.21940: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.21947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.21954: variable 'omit' from source: magic vars 13273 1726853313.22238: variable 'ansible_distribution_major_version' from source: facts 13273 1726853313.22247: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853313.22322: variable 'network_provider' from source: set_fact 13273 1726853313.22326: Evaluated conditional (network_provider == "nm"): True 13273 1726853313.22333: variable 'omit' from source: magic vars 13273 1726853313.22354: variable 'omit' from source: magic vars 13273 1726853313.22418: variable 'port2_profile' from source: play vars 13273 1726853313.22431: variable 'omit' from source: magic vars 13273 1726853313.22465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853313.22494: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853313.22509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853313.22522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853313.22533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853313.22558: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853313.22561: 
variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.22564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.22632: Set connection var ansible_connection to ssh 13273 1726853313.22640: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853313.22647: Set connection var ansible_shell_executable to /bin/sh 13273 1726853313.22650: Set connection var ansible_shell_type to sh 13273 1726853313.22655: Set connection var ansible_pipelining to False 13273 1726853313.22660: Set connection var ansible_timeout to 10 13273 1726853313.22684: variable 'ansible_shell_executable' from source: unknown 13273 1726853313.22687: variable 'ansible_connection' from source: unknown 13273 1726853313.22689: variable 'ansible_module_compression' from source: unknown 13273 1726853313.22691: variable 'ansible_shell_type' from source: unknown 13273 1726853313.22693: variable 'ansible_shell_executable' from source: unknown 13273 1726853313.22696: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.22700: variable 'ansible_pipelining' from source: unknown 13273 1726853313.22708: variable 'ansible_timeout' from source: unknown 13273 1726853313.22711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.22817: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853313.22826: variable 'omit' from source: magic vars 13273 1726853313.22854: starting attempt loop 13273 1726853313.22857: running the handler 13273 1726853313.22862: _low_level_execute_command(): starting 13273 1726853313.22962: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 
1726853313.23577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853313.23623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853313.23627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853313.23629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853313.23632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853313.23635: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853313.23637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853313.23653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853313.23662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853313.23679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853313.23685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853313.23695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853313.23732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853313.23735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853313.23737: stderr chunk (state=3): >>>debug2: match found <<< 13273 1726853313.23739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853313.23802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853313.23824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853313.23911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853313.25615: stdout chunk (state=3): >>>/root <<< 13273 1726853313.25714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853313.25748: stderr chunk (state=3): >>><<< 13273 1726853313.25750: stdout chunk (state=3): >>><<< 13273 1726853313.25768: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853313.25792: _low_level_execute_command(): starting 13273 1726853313.25798: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847 `" && echo 
ansible-tmp-1726853313.2578032-14732-221877301021847="` echo /root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847 `" ) && sleep 0' 13273 1726853313.26488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853313.26675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853313.28583: stdout chunk (state=3): >>>ansible-tmp-1726853313.2578032-14732-221877301021847=/root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847 <<< 13273 1726853313.28778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853313.28782: stdout chunk (state=3): >>><<< 13273 1726853313.28786: stderr chunk (state=3): >>><<< 13273 1726853313.28789: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853313.2578032-14732-221877301021847=/root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853313.28821: variable 'ansible_module_compression' from source: unknown 13273 1726853313.28883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853313.28919: variable 'ansible_facts' from source: unknown 13273 1726853313.29024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/AnsiballZ_command.py 13273 1726853313.29215: Sending initial data 13273 1726853313.29221: Sent initial data (156 bytes) 13273 1726853313.29801: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853313.29857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853313.29860: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13273 1726853313.29863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853313.29865: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853313.29869: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853313.29874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853313.30081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853313.30086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853313.30088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853313.30090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853313.31714: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853313.31764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853313.31841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp9n_u4cao /root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/AnsiballZ_command.py <<< 13273 1726853313.31870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/AnsiballZ_command.py" <<< 13273 1726853313.31937: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp9n_u4cao" to remote "/root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/AnsiballZ_command.py" <<< 13273 1726853313.32783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853313.32949: stderr chunk (state=3): >>><<< 13273 1726853313.32953: stdout chunk (state=3): >>><<< 13273 1726853313.32955: done transferring module to remote 13273 1726853313.32958: _low_level_execute_command(): starting 13273 1726853313.32961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/ /root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/AnsiballZ_command.py && sleep 0' 13273 1726853313.33631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853313.33709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853313.33738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853313.33789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853313.33988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853313.35893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853313.35897: stdout chunk (state=3): >>><<< 13273 1726853313.35899: stderr chunk (state=3): >>><<< 13273 1726853313.35903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853313.35905: _low_level_execute_command(): starting 13273 1726853313.35908: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/AnsiballZ_command.py && sleep 0' 13273 1726853313.37124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853313.37479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853313.37487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853313.37586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853313.54892: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-20 13:28:33.529605", "end": "2024-09-20 13:28:33.547590", "delta": "0:00:00.017985", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853313.56681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853313.56685: stdout chunk (state=3): >>><<< 13273 1726853313.56688: stderr chunk (state=3): >>><<< 13273 1726853313.56691: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-20 13:28:33.529605", "end": "2024-09-20 13:28:33.547590", "delta": "0:00:00.017985", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853313.56694: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853313.56696: _low_level_execute_command(): starting 13273 1726853313.56698: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853313.2578032-14732-221877301021847/ > /dev/null 2>&1 && sleep 0' 13273 1726853313.57379: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853313.57397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853313.57415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853313.57498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853313.59566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853313.59583: stdout chunk (state=3): >>><<< 13273 1726853313.59601: stderr chunk (state=3): >>><<< 13273 1726853313.59634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853313.59647: handler run complete 13273 1726853313.59692: Evaluated conditional (False): False 13273 1726853313.59715: attempt loop complete, returning result 13273 1726853313.59794: _execute() done 13273 1726853313.59797: dumping result to json 13273 1726853313.59805: done dumping result, returning 13273 1726853313.59810: done running TaskExecutor() for managed_node3/TASK: From the active connection, get the port2 profile "bond0.1" [02083763-bbaf-5fc3-657d-0000000000c1] 13273 1726853313.59812: sending task result for task 02083763-bbaf-5fc3-657d-0000000000c1 13273 1726853313.59890: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000c1 13273 1726853313.59893: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.1" ], "delta": "0:00:00.017985", "end": "2024-09-20 13:28:33.547590", "rc": 0, "start": "2024-09-20 13:28:33.529605" } 13273 1726853313.59983: no more pending results, returning what we have 13273 1726853313.59988: results queue empty 13273 1726853313.59989: checking for any_errors_fatal 13273 1726853313.60002: done checking for any_errors_fatal 13273 1726853313.60003: 
checking for max_fail_percentage 13273 1726853313.60005: done checking for max_fail_percentage 13273 1726853313.60006: checking to see if all hosts have failed and the running result is not ok 13273 1726853313.60007: done checking to see if all hosts have failed 13273 1726853313.60008: getting the remaining hosts for this loop 13273 1726853313.60009: done getting the remaining hosts for this loop 13273 1726853313.60013: getting the next task for host managed_node3 13273 1726853313.60020: done getting next task for host managed_node3 13273 1726853313.60022: ^ task is: TASK: Assert that the port1 profile is not activated 13273 1726853313.60025: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853313.60029: getting variables 13273 1726853313.60031: in VariableManager get_vars() 13273 1726853313.60199: Calling all_inventory to load vars for managed_node3 13273 1726853313.60203: Calling groups_inventory to load vars for managed_node3 13273 1726853313.60205: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853313.60217: Calling all_plugins_play to load vars for managed_node3 13273 1726853313.60222: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853313.60225: Calling groups_plugins_play to load vars for managed_node3 13273 1726853313.61838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853313.63460: done with get_vars() 13273 1726853313.63489: done getting variables 13273 1726853313.63557: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Friday 20 September 2024 13:28:33 -0400 (0:00:00.423) 0:00:31.525 ****** 13273 1726853313.63598: entering _queue_task() for managed_node3/assert 13273 1726853313.63967: worker is 1 (out of 1 available) 13273 1726853313.63992: exiting _queue_task() for managed_node3/assert 13273 1726853313.64006: done queuing things up, now waiting for results queue to drain 13273 1726853313.64007: waiting for pending results... 13273 1726853313.64360: running TaskExecutor() for managed_node3/TASK: Assert that the port1 profile is not activated 13273 1726853313.64366: in run() - task 02083763-bbaf-5fc3-657d-0000000000c2 13273 1726853313.64370: variable 'ansible_search_path' from source: unknown 13273 1726853313.64464: calling self._execute() 13273 1726853313.64493: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.64499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.64509: variable 'omit' from source: magic vars 13273 1726853313.64977: variable 'ansible_distribution_major_version' from source: facts 13273 1726853313.64983: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853313.65102: variable 'network_provider' from source: set_fact 13273 1726853313.65106: Evaluated conditional (network_provider == "nm"): True 13273 1726853313.65109: variable 'omit' from source: magic vars 13273 1726853313.65111: variable 'omit' from source: magic vars 13273 1726853313.65114: variable 'port1_profile' from source: play vars 13273 1726853313.65124: variable 'omit' from source: magic vars 13273 
1726853313.65211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853313.65215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853313.65218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853313.65236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853313.65252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853313.65278: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853313.65282: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.65284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.65373: Set connection var ansible_connection to ssh 13273 1726853313.65383: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853313.65428: Set connection var ansible_shell_executable to /bin/sh 13273 1726853313.65431: Set connection var ansible_shell_type to sh 13273 1726853313.65434: Set connection var ansible_pipelining to False 13273 1726853313.65436: Set connection var ansible_timeout to 10 13273 1726853313.65439: variable 'ansible_shell_executable' from source: unknown 13273 1726853313.65441: variable 'ansible_connection' from source: unknown 13273 1726853313.65443: variable 'ansible_module_compression' from source: unknown 13273 1726853313.65445: variable 'ansible_shell_type' from source: unknown 13273 1726853313.65447: variable 'ansible_shell_executable' from source: unknown 13273 1726853313.65448: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.65450: variable 'ansible_pipelining' from source: unknown 13273 1726853313.65452: variable 
'ansible_timeout' from source: unknown 13273 1726853313.65455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.65578: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853313.65586: variable 'omit' from source: magic vars 13273 1726853313.65592: starting attempt loop 13273 1726853313.65595: running the handler 13273 1726853313.65753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853313.67868: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853313.67938: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853313.67974: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853313.68008: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853313.68031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853313.68102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853313.68129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853313.68154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853313.68193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853313.68244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853313.68305: variable 'active_port1_profile' from source: set_fact 13273 1726853313.68325: Evaluated conditional (active_port1_profile.stdout | length == 0): True 13273 1726853313.68330: handler run complete 13273 1726853313.68344: attempt loop complete, returning result 13273 1726853313.68352: _execute() done 13273 1726853313.68355: dumping result to json 13273 1726853313.68357: done dumping result, returning 13273 1726853313.68386: done running TaskExecutor() for managed_node3/TASK: Assert that the port1 profile is not activated [02083763-bbaf-5fc3-657d-0000000000c2] 13273 1726853313.68390: sending task result for task 02083763-bbaf-5fc3-657d-0000000000c2 13273 1726853313.68451: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000c2 13273 1726853313.68454: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853313.68506: no more pending results, returning what we have 13273 1726853313.68509: results queue empty 13273 1726853313.68510: checking for any_errors_fatal 13273 1726853313.68517: done checking for any_errors_fatal 13273 1726853313.68517: checking for max_fail_percentage 13273 1726853313.68519: done checking for max_fail_percentage 13273 1726853313.68520: checking to see if all hosts have failed and the running result is not ok 13273 1726853313.68520: done checking to see if all hosts have failed 13273 1726853313.68521: getting 
the remaining hosts for this loop 13273 1726853313.68523: done getting the remaining hosts for this loop 13273 1726853313.68526: getting the next task for host managed_node3 13273 1726853313.68531: done getting next task for host managed_node3 13273 1726853313.68533: ^ task is: TASK: Assert that the port2 profile is not activated 13273 1726853313.68535: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853313.68538: getting variables 13273 1726853313.68546: in VariableManager get_vars() 13273 1726853313.68601: Calling all_inventory to load vars for managed_node3 13273 1726853313.68604: Calling groups_inventory to load vars for managed_node3 13273 1726853313.68606: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853313.68615: Calling all_plugins_play to load vars for managed_node3 13273 1726853313.68618: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853313.68621: Calling groups_plugins_play to load vars for managed_node3 13273 1726853313.70155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853313.71647: done with get_vars() 13273 1726853313.71668: done getting variables 13273 1726853313.71727: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Friday 20 September 2024 13:28:33 -0400 (0:00:00.081) 0:00:31.606 ****** 13273 1726853313.71757: entering _queue_task() for managed_node3/assert 13273 1726853313.72066: worker is 1 (out of 1 available) 13273 1726853313.72080: exiting _queue_task() for managed_node3/assert 13273 1726853313.72093: done queuing things up, now waiting for results queue to drain 13273 1726853313.72094: waiting for pending results... 13273 1726853313.72461: running TaskExecutor() for managed_node3/TASK: Assert that the port2 profile is not activated 13273 1726853313.72466: in run() - task 02083763-bbaf-5fc3-657d-0000000000c3 13273 1726853313.72470: variable 'ansible_search_path' from source: unknown 13273 1726853313.72498: calling self._execute() 13273 1726853313.72603: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.72668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.72674: variable 'omit' from source: magic vars 13273 1726853313.72996: variable 'ansible_distribution_major_version' from source: facts 13273 1726853313.73007: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853313.73124: variable 'network_provider' from source: set_fact 13273 1726853313.73135: Evaluated conditional (network_provider == "nm"): True 13273 1726853313.73142: variable 'omit' from source: magic vars 13273 1726853313.73165: variable 'omit' from source: magic vars 13273 1726853313.73265: variable 'port2_profile' from source: play vars 13273 1726853313.73320: variable 'omit' from source: magic vars 13273 1726853313.73324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853313.73362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853313.73381: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853313.73398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853313.73409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853313.73536: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853313.73540: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.73542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.73544: Set connection var ansible_connection to ssh 13273 1726853313.73552: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853313.73558: Set connection var ansible_shell_executable to /bin/sh 13273 1726853313.73560: Set connection var ansible_shell_type to sh 13273 1726853313.73572: Set connection var ansible_pipelining to False 13273 1726853313.73578: Set connection var ansible_timeout to 10 13273 1726853313.73603: variable 'ansible_shell_executable' from source: unknown 13273 1726853313.73606: variable 'ansible_connection' from source: unknown 13273 1726853313.73609: variable 'ansible_module_compression' from source: unknown 13273 1726853313.73611: variable 'ansible_shell_type' from source: unknown 13273 1726853313.73613: variable 'ansible_shell_executable' from source: unknown 13273 1726853313.73615: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.73621: variable 'ansible_pipelining' from source: unknown 13273 1726853313.73623: variable 'ansible_timeout' from source: unknown 13273 1726853313.73627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.73764: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853313.73775: variable 'omit' from source: magic vars 13273 1726853313.73785: starting attempt loop 13273 1726853313.73788: running the handler 13273 1726853313.73951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853313.76976: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853313.77181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853313.77242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853313.77315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853313.77351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853313.77443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853313.77484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853313.77514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853313.77562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 13273 1726853313.77582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853313.77689: variable 'active_port2_profile' from source: set_fact 13273 1726853313.77713: Evaluated conditional (active_port2_profile.stdout | length == 0): True 13273 1726853313.77765: handler run complete 13273 1726853313.77769: attempt loop complete, returning result 13273 1726853313.77772: _execute() done 13273 1726853313.77775: dumping result to json 13273 1726853313.77777: done dumping result, returning 13273 1726853313.77779: done running TaskExecutor() for managed_node3/TASK: Assert that the port2 profile is not activated [02083763-bbaf-5fc3-657d-0000000000c3] 13273 1726853313.77781: sending task result for task 02083763-bbaf-5fc3-657d-0000000000c3 13273 1726853313.77984: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000c3 13273 1726853313.77987: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853313.78036: no more pending results, returning what we have 13273 1726853313.78040: results queue empty 13273 1726853313.78041: checking for any_errors_fatal 13273 1726853313.78047: done checking for any_errors_fatal 13273 1726853313.78048: checking for max_fail_percentage 13273 1726853313.78050: done checking for max_fail_percentage 13273 1726853313.78050: checking to see if all hosts have failed and the running result is not ok 13273 1726853313.78051: done checking to see if all hosts have failed 13273 1726853313.78052: getting the remaining hosts for this loop 13273 1726853313.78054: done getting the remaining hosts for this loop 13273 1726853313.78058: getting the next task for host managed_node3 13273 1726853313.78064: done getting next task for host managed_node3 13273 1726853313.78067: ^ task is: TASK: Get the 
port1 device state 13273 1726853313.78069: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853313.78076: getting variables 13273 1726853313.78275: in VariableManager get_vars() 13273 1726853313.78323: Calling all_inventory to load vars for managed_node3 13273 1726853313.78326: Calling groups_inventory to load vars for managed_node3 13273 1726853313.78328: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853313.78337: Calling all_plugins_play to load vars for managed_node3 13273 1726853313.78340: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853313.78343: Calling groups_plugins_play to load vars for managed_node3 13273 1726853313.80457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853313.83624: done with get_vars() 13273 1726853313.83653: done getting variables 13273 1726853313.83912: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Friday 20 September 2024 13:28:33 -0400 (0:00:00.121) 0:00:31.728 ****** 13273 1726853313.83943: entering _queue_task() for managed_node3/command 13273 1726853313.84780: worker is 1 (out of 1 available) 13273 1726853313.84981: exiting _queue_task() for managed_node3/command 13273 1726853313.84993: done 
queuing things up, now waiting for results queue to drain 13273 1726853313.84994: waiting for pending results... 13273 1726853313.85451: running TaskExecutor() for managed_node3/TASK: Get the port1 device state 13273 1726853313.85490: in run() - task 02083763-bbaf-5fc3-657d-0000000000c4 13273 1726853313.85506: variable 'ansible_search_path' from source: unknown 13273 1726853313.85547: calling self._execute() 13273 1726853313.85828: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.85832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.85836: variable 'omit' from source: magic vars 13273 1726853313.86689: variable 'ansible_distribution_major_version' from source: facts 13273 1726853313.86702: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853313.86944: variable 'network_provider' from source: set_fact 13273 1726853313.86951: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853313.86954: when evaluation is False, skipping this task 13273 1726853313.86957: _execute() done 13273 1726853313.86960: dumping result to json 13273 1726853313.86962: done dumping result, returning 13273 1726853313.87003: done running TaskExecutor() for managed_node3/TASK: Get the port1 device state [02083763-bbaf-5fc3-657d-0000000000c4] 13273 1726853313.87007: sending task result for task 02083763-bbaf-5fc3-657d-0000000000c4 13273 1726853313.87077: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000c4 13273 1726853313.87080: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853313.87157: no more pending results, returning what we have 13273 1726853313.87161: results queue empty 13273 1726853313.87162: checking for any_errors_fatal 13273 1726853313.87168: done checking for any_errors_fatal 13273 
1726853313.87169: checking for max_fail_percentage 13273 1726853313.87172: done checking for max_fail_percentage 13273 1726853313.87173: checking to see if all hosts have failed and the running result is not ok 13273 1726853313.87173: done checking to see if all hosts have failed 13273 1726853313.87174: getting the remaining hosts for this loop 13273 1726853313.87176: done getting the remaining hosts for this loop 13273 1726853313.87179: getting the next task for host managed_node3 13273 1726853313.87186: done getting next task for host managed_node3 13273 1726853313.87188: ^ task is: TASK: Get the port2 device state 13273 1726853313.87191: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853313.87196: getting variables 13273 1726853313.87197: in VariableManager get_vars() 13273 1726853313.87252: Calling all_inventory to load vars for managed_node3 13273 1726853313.87255: Calling groups_inventory to load vars for managed_node3 13273 1726853313.87258: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853313.87269: Calling all_plugins_play to load vars for managed_node3 13273 1726853313.87475: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853313.87481: Calling groups_plugins_play to load vars for managed_node3 13273 1726853313.90402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853313.93351: done with get_vars() 13273 1726853313.93578: done getting variables 13273 1726853313.93639: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the port2 device state] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139 Friday 20 September 2024 13:28:33 -0400 (0:00:00.097) 0:00:31.825 ****** 13273 1726853313.93667: entering _queue_task() for managed_node3/command 13273 1726853313.94385: worker is 1 (out of 1 available) 13273 1726853313.94398: exiting _queue_task() for managed_node3/command 13273 1726853313.94411: done queuing things up, now waiting for results queue to drain 13273 1726853313.94412: waiting for pending results... 13273 1726853313.95003: running TaskExecutor() for managed_node3/TASK: Get the port2 device state 13273 1726853313.95009: in run() - task 02083763-bbaf-5fc3-657d-0000000000c5 13273 1726853313.95223: variable 'ansible_search_path' from source: unknown 13273 1726853313.95261: calling self._execute() 13273 1726853313.95424: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853313.95546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853313.95559: variable 'omit' from source: magic vars 13273 1726853313.96363: variable 'ansible_distribution_major_version' from source: facts 13273 1726853313.96378: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853313.96538: variable 'network_provider' from source: set_fact 13273 1726853313.96542: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853313.96549: when evaluation is False, skipping this task 13273 1726853313.96552: _execute() done 13273 1726853313.96555: dumping result to json 13273 1726853313.96558: done dumping result, returning 13273 1726853313.96565: done running TaskExecutor() for managed_node3/TASK: Get 
the port2 device state [02083763-bbaf-5fc3-657d-0000000000c5] 13273 1726853313.96570: sending task result for task 02083763-bbaf-5fc3-657d-0000000000c5 13273 1726853313.96769: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000c5 13273 1726853313.96773: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853313.96861: no more pending results, returning what we have 13273 1726853313.96864: results queue empty 13273 1726853313.96865: checking for any_errors_fatal 13273 1726853313.96869: done checking for any_errors_fatal 13273 1726853313.96870: checking for max_fail_percentage 13273 1726853313.96872: done checking for max_fail_percentage 13273 1726853313.96873: checking to see if all hosts have failed and the running result is not ok 13273 1726853313.96874: done checking to see if all hosts have failed 13273 1726853313.96875: getting the remaining hosts for this loop 13273 1726853313.96876: done getting the remaining hosts for this loop 13273 1726853313.96878: getting the next task for host managed_node3 13273 1726853313.96883: done getting next task for host managed_node3 13273 1726853313.96885: ^ task is: TASK: Assert that the port1 device is in DOWN state 13273 1726853313.96887: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853313.96890: getting variables 13273 1726853313.96891: in VariableManager get_vars() 13273 1726853313.96933: Calling all_inventory to load vars for managed_node3 13273 1726853313.96935: Calling groups_inventory to load vars for managed_node3 13273 1726853313.96937: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853313.96946: Calling all_plugins_play to load vars for managed_node3 13273 1726853313.96948: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853313.96951: Calling groups_plugins_play to load vars for managed_node3 13273 1726853313.98267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853314.00500: done with get_vars() 13273 1726853314.00530: done getting variables 13273 1726853314.00598: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port1 device is in DOWN state] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146 Friday 20 September 2024 13:28:34 -0400 (0:00:00.069) 0:00:31.895 ****** 13273 1726853314.00634: entering _queue_task() for managed_node3/assert 13273 1726853314.01025: worker is 1 (out of 1 available) 13273 1726853314.01037: exiting _queue_task() for managed_node3/assert 13273 1726853314.01049: done queuing things up, now waiting for results queue to drain 13273 1726853314.01050: waiting for pending results... 
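The two "Get the portN device state" tasks above are skipped because the log's "Evaluated conditional" lines show `network_provider == "initscripts"` is False on this host (the provider is "nm"). A minimal sketch of what such a gated task likely looks like in the playbook — the task name, action plugin ('command'), and both conditionals are taken from the log, but the command body itself is an assumption, since the playbook source is not part of this transcript:

```yaml
# Hedged reconstruction of the skipped task shape traced above.
# The conditionals come from the "Evaluated conditional" log lines;
# the command itself is assumed (the real body is not in this log).
- name: Get the port1 device state
  command: ip link show port1   # assumed; actual command unknown
  register: port1_device_state
  changed_when: false
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
```

When either `when` condition is False, Ansible emits exactly the `skipping: [...] => {"false_condition": ...}` result seen in the log, without running the command on the target.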
13273 1726853314.01336: running TaskExecutor() for managed_node3/TASK: Assert that the port1 device is in DOWN state 13273 1726853314.01478: in run() - task 02083763-bbaf-5fc3-657d-0000000000c6 13273 1726853314.01482: variable 'ansible_search_path' from source: unknown 13273 1726853314.01506: calling self._execute() 13273 1726853314.01615: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.01627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853314.01639: variable 'omit' from source: magic vars 13273 1726853314.02023: variable 'ansible_distribution_major_version' from source: facts 13273 1726853314.02209: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853314.02286: variable 'network_provider' from source: set_fact 13273 1726853314.02296: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853314.02303: when evaluation is False, skipping this task 13273 1726853314.02309: _execute() done 13273 1726853314.02315: dumping result to json 13273 1726853314.02325: done dumping result, returning 13273 1726853314.02336: done running TaskExecutor() for managed_node3/TASK: Assert that the port1 device is in DOWN state [02083763-bbaf-5fc3-657d-0000000000c6] 13273 1726853314.02345: sending task result for task 02083763-bbaf-5fc3-657d-0000000000c6 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853314.02515: no more pending results, returning what we have 13273 1726853314.02520: results queue empty 13273 1726853314.02521: checking for any_errors_fatal 13273 1726853314.02527: done checking for any_errors_fatal 13273 1726853314.02528: checking for max_fail_percentage 13273 1726853314.02531: done checking for max_fail_percentage 13273 1726853314.02531: checking to see if all hosts have failed and the running result is not ok 13273 
1726853314.02532: done checking to see if all hosts have failed 13273 1726853314.02533: getting the remaining hosts for this loop 13273 1726853314.02534: done getting the remaining hosts for this loop 13273 1726853314.02538: getting the next task for host managed_node3 13273 1726853314.02545: done getting next task for host managed_node3 13273 1726853314.02548: ^ task is: TASK: Assert that the port2 device is in DOWN state 13273 1726853314.02551: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853314.02554: getting variables 13273 1726853314.02556: in VariableManager get_vars() 13273 1726853314.02615: Calling all_inventory to load vars for managed_node3 13273 1726853314.02619: Calling groups_inventory to load vars for managed_node3 13273 1726853314.02623: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853314.02635: Calling all_plugins_play to load vars for managed_node3 13273 1726853314.02638: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853314.02641: Calling groups_plugins_play to load vars for managed_node3 13273 1726853314.03402: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000c6 13273 1726853314.03406: WORKER PROCESS EXITING 13273 1726853314.04514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853314.06157: done with get_vars() 13273 1726853314.06181: done getting variables 13273 1726853314.06239: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Assert that the port2 device is in DOWN state] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153 Friday 20 September 2024 13:28:34 -0400 (0:00:00.056) 0:00:31.951 ****** 13273 1726853314.06268: entering _queue_task() for managed_node3/assert 13273 1726853314.06728: worker is 1 (out of 1 available) 13273 1726853314.06787: exiting _queue_task() for managed_node3/assert 13273 1726853314.06799: done queuing things up, now waiting for results queue to drain 13273 1726853314.06800: waiting for pending results... 13273 1726853314.07188: running TaskExecutor() for managed_node3/TASK: Assert that the port2 device is in DOWN state 13273 1726853314.07194: in run() - task 02083763-bbaf-5fc3-657d-0000000000c7 13273 1726853314.07197: variable 'ansible_search_path' from source: unknown 13273 1726853314.07205: calling self._execute() 13273 1726853314.07256: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.07263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853314.07274: variable 'omit' from source: magic vars 13273 1726853314.07756: variable 'ansible_distribution_major_version' from source: facts 13273 1726853314.07760: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853314.07778: variable 'network_provider' from source: set_fact 13273 1726853314.07793: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853314.07798: when evaluation is False, skipping this task 13273 1726853314.07801: _execute() done 13273 1726853314.07804: dumping result to json 13273 1726853314.07806: done dumping result, returning 13273 1726853314.07809: done running TaskExecutor() for managed_node3/TASK: Assert that the port2 device is in DOWN state [02083763-bbaf-5fc3-657d-0000000000c7] 13273 1726853314.07815: sending task 
result for task 02083763-bbaf-5fc3-657d-0000000000c7 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853314.08019: no more pending results, returning what we have 13273 1726853314.08023: results queue empty 13273 1726853314.08024: checking for any_errors_fatal 13273 1726853314.08030: done checking for any_errors_fatal 13273 1726853314.08031: checking for max_fail_percentage 13273 1726853314.08032: done checking for max_fail_percentage 13273 1726853314.08033: checking to see if all hosts have failed and the running result is not ok 13273 1726853314.08033: done checking to see if all hosts have failed 13273 1726853314.08034: getting the remaining hosts for this loop 13273 1726853314.08035: done getting the remaining hosts for this loop 13273 1726853314.08038: getting the next task for host managed_node3 13273 1726853314.08045: done getting next task for host managed_node3 13273 1726853314.08050: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13273 1726853314.08053: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853314.08073: getting variables 13273 1726853314.08075: in VariableManager get_vars() 13273 1726853314.08118: Calling all_inventory to load vars for managed_node3 13273 1726853314.08121: Calling groups_inventory to load vars for managed_node3 13273 1726853314.08123: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853314.08130: Calling all_plugins_play to load vars for managed_node3 13273 1726853314.08132: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853314.08135: Calling groups_plugins_play to load vars for managed_node3 13273 1726853314.08688: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000c7 13273 1726853314.08692: WORKER PROCESS EXITING 13273 1726853314.09983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853314.11826: done with get_vars() 13273 1726853314.11848: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:28:34 -0400 (0:00:00.056) 0:00:32.008 ****** 13273 1726853314.11943: entering _queue_task() for managed_node3/include_tasks 13273 1726853314.12285: worker is 1 (out of 1 available) 13273 1726853314.12297: exiting _queue_task() for managed_node3/include_tasks 13273 1726853314.12309: done queuing things up, now waiting for results queue to drain 13273 1726853314.12310: waiting for pending results... 
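The assert tasks traced above ("Assert that the port1/port2 profile is not activated") each pass with "All assertions passed". A minimal sketch of that assert pattern — the task name, the asserted expression, and the gating conditionals all appear verbatim in the log's "Evaluated conditional" lines; the `fail_msg` is an assumption added for illustration:

```yaml
# Hedged reconstruction of the assert tasks logged above.
# active_port2_profile is a set_fact registered earlier in the play
# (its origin is outside this log excerpt).
- name: Assert that the port2 profile is not activated
  assert:
    that:
      - active_port2_profile.stdout | length == 0
    fail_msg: port2 profile is still active   # assumed; not in the log
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
```

An empty `stdout` from the earlier profile query makes the `that` expression True, which produces the `ok: [managed_node3]` / "All assertions passed" result recorded in the transcript.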
13273 1726853314.12587: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13273 1726853314.12769: in run() - task 02083763-bbaf-5fc3-657d-0000000000cf 13273 1726853314.12796: variable 'ansible_search_path' from source: unknown 13273 1726853314.12806: variable 'ansible_search_path' from source: unknown 13273 1726853314.12855: calling self._execute() 13273 1726853314.13077: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.13082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853314.13102: variable 'omit' from source: magic vars 13273 1726853314.13932: variable 'ansible_distribution_major_version' from source: facts 13273 1726853314.14052: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853314.14055: _execute() done 13273 1726853314.14057: dumping result to json 13273 1726853314.14060: done dumping result, returning 13273 1726853314.14063: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-5fc3-657d-0000000000cf] 13273 1726853314.14065: sending task result for task 02083763-bbaf-5fc3-657d-0000000000cf 13273 1726853314.14144: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000cf 13273 1726853314.14367: no more pending results, returning what we have 13273 1726853314.14375: in VariableManager get_vars() 13273 1726853314.14438: Calling all_inventory to load vars for managed_node3 13273 1726853314.14441: Calling groups_inventory to load vars for managed_node3 13273 1726853314.14444: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853314.14457: Calling all_plugins_play to load vars for managed_node3 13273 1726853314.14461: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853314.14464: Calling groups_plugins_play to load vars for managed_node3 13273 
1726853314.15384: WORKER PROCESS EXITING 13273 1726853314.16033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853314.16881: done with get_vars() 13273 1726853314.16894: variable 'ansible_search_path' from source: unknown 13273 1726853314.16895: variable 'ansible_search_path' from source: unknown 13273 1726853314.16921: we have included files to process 13273 1726853314.16921: generating all_blocks data 13273 1726853314.16923: done generating all_blocks data 13273 1726853314.16927: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853314.16928: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853314.16929: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853314.17307: done processing included file 13273 1726853314.17308: iterating over new_blocks loaded from include file 13273 1726853314.17309: in VariableManager get_vars() 13273 1726853314.17329: done with get_vars() 13273 1726853314.17330: filtering new block on tags 13273 1726853314.17348: done filtering new block on tags 13273 1726853314.17352: in VariableManager get_vars() 13273 1726853314.17379: done with get_vars() 13273 1726853314.17381: filtering new block on tags 13273 1726853314.17394: done filtering new block on tags 13273 1726853314.17395: in VariableManager get_vars() 13273 1726853314.17426: done with get_vars() 13273 1726853314.17429: filtering new block on tags 13273 1726853314.17449: done filtering new block on tags 13273 1726853314.17451: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13273 1726853314.17456: extending task lists for all hosts 
with included blocks 13273 1726853314.18242: done extending task lists 13273 1726853314.18243: done processing included files 13273 1726853314.18244: results queue empty 13273 1726853314.18244: checking for any_errors_fatal 13273 1726853314.18248: done checking for any_errors_fatal 13273 1726853314.18249: checking for max_fail_percentage 13273 1726853314.18250: done checking for max_fail_percentage 13273 1726853314.18250: checking to see if all hosts have failed and the running result is not ok 13273 1726853314.18251: done checking to see if all hosts have failed 13273 1726853314.18252: getting the remaining hosts for this loop 13273 1726853314.18253: done getting the remaining hosts for this loop 13273 1726853314.18255: getting the next task for host managed_node3 13273 1726853314.18260: done getting next task for host managed_node3 13273 1726853314.18262: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13273 1726853314.18266: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853314.18279: getting variables 13273 1726853314.18281: in VariableManager get_vars() 13273 1726853314.18300: Calling all_inventory to load vars for managed_node3 13273 1726853314.18302: Calling groups_inventory to load vars for managed_node3 13273 1726853314.18304: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853314.18309: Calling all_plugins_play to load vars for managed_node3 13273 1726853314.18311: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853314.18313: Calling groups_plugins_play to load vars for managed_node3 13273 1726853314.19078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853314.19912: done with get_vars() 13273 1726853314.19927: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:28:34 -0400 (0:00:00.080) 0:00:32.088 ****** 13273 1726853314.19981: entering _queue_task() for managed_node3/setup 13273 1726853314.20215: worker is 1 (out of 1 available) 13273 1726853314.20230: exiting _queue_task() for managed_node3/setup 13273 1726853314.20241: done queuing things up, now waiting for results queue to drain 13273 1726853314.20242: waiting for pending results... 
13273 1726853314.20443: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13273 1726853314.20682: in run() - task 02083763-bbaf-5fc3-657d-000000000796 13273 1726853314.20686: variable 'ansible_search_path' from source: unknown 13273 1726853314.20688: variable 'ansible_search_path' from source: unknown 13273 1726853314.20690: calling self._execute() 13273 1726853314.20770: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.20790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853314.20805: variable 'omit' from source: magic vars 13273 1726853314.21182: variable 'ansible_distribution_major_version' from source: facts 13273 1726853314.21200: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853314.21423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853314.23036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853314.23090: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853314.23117: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853314.23147: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853314.23165: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853314.23230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853314.23252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853314.23269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853314.23303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853314.23313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853314.23351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853314.23366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853314.23385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853314.23413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853314.23424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853314.23534: variable '__network_required_facts' from source: role 
'' defaults 13273 1726853314.23573: variable 'ansible_facts' from source: unknown 13273 1726853314.24307: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13273 1726853314.24311: when evaluation is False, skipping this task 13273 1726853314.24314: _execute() done 13273 1726853314.24317: dumping result to json 13273 1726853314.24319: done dumping result, returning 13273 1726853314.24321: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-5fc3-657d-000000000796] 13273 1726853314.24323: sending task result for task 02083763-bbaf-5fc3-657d-000000000796 13273 1726853314.24382: done sending task result for task 02083763-bbaf-5fc3-657d-000000000796 13273 1726853314.24385: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853314.24452: no more pending results, returning what we have 13273 1726853314.24456: results queue empty 13273 1726853314.24457: checking for any_errors_fatal 13273 1726853314.24459: done checking for any_errors_fatal 13273 1726853314.24459: checking for max_fail_percentage 13273 1726853314.24461: done checking for max_fail_percentage 13273 1726853314.24462: checking to see if all hosts have failed and the running result is not ok 13273 1726853314.24463: done checking to see if all hosts have failed 13273 1726853314.24463: getting the remaining hosts for this loop 13273 1726853314.24465: done getting the remaining hosts for this loop 13273 1726853314.24468: getting the next task for host managed_node3 13273 1726853314.24478: done getting next task for host managed_node3 13273 1726853314.24483: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13273 1726853314.24486: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853314.24506: getting variables 13273 1726853314.24507: in VariableManager get_vars() 13273 1726853314.24612: Calling all_inventory to load vars for managed_node3 13273 1726853314.24615: Calling groups_inventory to load vars for managed_node3 13273 1726853314.24617: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853314.24655: Calling all_plugins_play to load vars for managed_node3 13273 1726853314.24658: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853314.24662: Calling groups_plugins_play to load vars for managed_node3 13273 1726853314.25813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853314.26948: done with get_vars() 13273 1726853314.26963: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:28:34 -0400 (0:00:00.070) 0:00:32.159 ****** 13273 1726853314.27038: entering _queue_task() for managed_node3/stat 13273 1726853314.27258: worker is 1 (out of 1 
available) 13273 1726853314.27273: exiting _queue_task() for managed_node3/stat 13273 1726853314.27284: done queuing things up, now waiting for results queue to drain 13273 1726853314.27285: waiting for pending results... 13273 1726853314.27470: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13273 1726853314.27579: in run() - task 02083763-bbaf-5fc3-657d-000000000798 13273 1726853314.27590: variable 'ansible_search_path' from source: unknown 13273 1726853314.27594: variable 'ansible_search_path' from source: unknown 13273 1726853314.27623: calling self._execute() 13273 1726853314.27703: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.27709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853314.27717: variable 'omit' from source: magic vars 13273 1726853314.28000: variable 'ansible_distribution_major_version' from source: facts 13273 1726853314.28009: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853314.28119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853314.28576: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853314.28580: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853314.28582: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853314.28584: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853314.28618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853314.28648: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853314.28679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853314.28719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853314.28806: variable '__network_is_ostree' from source: set_fact 13273 1726853314.28926: Evaluated conditional (not __network_is_ostree is defined): False 13273 1726853314.28930: when evaluation is False, skipping this task 13273 1726853314.28932: _execute() done 13273 1726853314.28934: dumping result to json 13273 1726853314.28936: done dumping result, returning 13273 1726853314.28939: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-5fc3-657d-000000000798] 13273 1726853314.28941: sending task result for task 02083763-bbaf-5fc3-657d-000000000798 13273 1726853314.29004: done sending task result for task 02083763-bbaf-5fc3-657d-000000000798 13273 1726853314.29007: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13273 1726853314.29085: no more pending results, returning what we have 13273 1726853314.29089: results queue empty 13273 1726853314.29090: checking for any_errors_fatal 13273 1726853314.29099: done checking for any_errors_fatal 13273 1726853314.29100: checking for max_fail_percentage 13273 1726853314.29102: done checking for max_fail_percentage 13273 1726853314.29103: checking to see if all hosts have failed and the running result is not ok 13273 
1726853314.29109: done checking to see if all hosts have failed 13273 1726853314.29110: getting the remaining hosts for this loop 13273 1726853314.29112: done getting the remaining hosts for this loop 13273 1726853314.29115: getting the next task for host managed_node3 13273 1726853314.29122: done getting next task for host managed_node3 13273 1726853314.29125: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13273 1726853314.29128: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853314.29150: getting variables 13273 1726853314.29151: in VariableManager get_vars() 13273 1726853314.29200: Calling all_inventory to load vars for managed_node3 13273 1726853314.29203: Calling groups_inventory to load vars for managed_node3 13273 1726853314.29205: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853314.29214: Calling all_plugins_play to load vars for managed_node3 13273 1726853314.29216: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853314.29219: Calling groups_plugins_play to load vars for managed_node3 13273 1726853314.30085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853314.30927: done with get_vars() 13273 1726853314.30940: done getting variables 13273 1726853314.30981: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:28:34 -0400 (0:00:00.039) 0:00:32.199 ****** 13273 1726853314.31005: entering _queue_task() for managed_node3/set_fact 13273 1726853314.31212: worker is 1 (out of 1 available) 13273 1726853314.31225: exiting _queue_task() for managed_node3/set_fact 13273 1726853314.31238: done queuing things up, now waiting for results queue to drain 13273 1726853314.31239: waiting for pending results... 
13273 1726853314.31410: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13273 1726853314.31509: in run() - task 02083763-bbaf-5fc3-657d-000000000799 13273 1726853314.31520: variable 'ansible_search_path' from source: unknown 13273 1726853314.31523: variable 'ansible_search_path' from source: unknown 13273 1726853314.31552: calling self._execute() 13273 1726853314.31625: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.31629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853314.31637: variable 'omit' from source: magic vars 13273 1726853314.31905: variable 'ansible_distribution_major_version' from source: facts 13273 1726853314.31911: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853314.32021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853314.32209: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853314.32243: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853314.32269: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853314.32296: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853314.32358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853314.32377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853314.32395: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853314.32412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853314.32475: variable '__network_is_ostree' from source: set_fact 13273 1726853314.32481: Evaluated conditional (not __network_is_ostree is defined): False 13273 1726853314.32484: when evaluation is False, skipping this task 13273 1726853314.32486: _execute() done 13273 1726853314.32489: dumping result to json 13273 1726853314.32492: done dumping result, returning 13273 1726853314.32499: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-5fc3-657d-000000000799] 13273 1726853314.32503: sending task result for task 02083763-bbaf-5fc3-657d-000000000799 13273 1726853314.32578: done sending task result for task 02083763-bbaf-5fc3-657d-000000000799 13273 1726853314.32580: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13273 1726853314.32628: no more pending results, returning what we have 13273 1726853314.32631: results queue empty 13273 1726853314.32632: checking for any_errors_fatal 13273 1726853314.32637: done checking for any_errors_fatal 13273 1726853314.32637: checking for max_fail_percentage 13273 1726853314.32639: done checking for max_fail_percentage 13273 1726853314.32639: checking to see if all hosts have failed and the running result is not ok 13273 1726853314.32640: done checking to see if all hosts have failed 13273 1726853314.32641: getting the remaining hosts for this loop 13273 1726853314.32642: done getting the remaining hosts for this loop 
13273 1726853314.32646: getting the next task for host managed_node3 13273 1726853314.32654: done getting next task for host managed_node3 13273 1726853314.32657: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13273 1726853314.32661: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853314.32683: getting variables 13273 1726853314.32685: in VariableManager get_vars() 13273 1726853314.32725: Calling all_inventory to load vars for managed_node3 13273 1726853314.32728: Calling groups_inventory to load vars for managed_node3 13273 1726853314.32730: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853314.32737: Calling all_plugins_play to load vars for managed_node3 13273 1726853314.32739: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853314.32742: Calling groups_plugins_play to load vars for managed_node3 13273 1726853314.33546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853314.34565: done with get_vars() 13273 1726853314.34581: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:28:34 -0400 (0:00:00.036) 0:00:32.235 ****** 13273 1726853314.34649: entering _queue_task() for managed_node3/service_facts 13273 1726853314.34858: worker is 1 (out of 1 available) 13273 1726853314.34873: exiting _queue_task() for managed_node3/service_facts 13273 1726853314.34884: done queuing things up, now waiting for results queue to drain 13273 1726853314.34885: waiting for pending results... 
13273 1726853314.35072: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13273 1726853314.35168: in run() - task 02083763-bbaf-5fc3-657d-00000000079b 13273 1726853314.35182: variable 'ansible_search_path' from source: unknown 13273 1726853314.35186: variable 'ansible_search_path' from source: unknown 13273 1726853314.35214: calling self._execute() 13273 1726853314.35290: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.35295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853314.35303: variable 'omit' from source: magic vars 13273 1726853314.35575: variable 'ansible_distribution_major_version' from source: facts 13273 1726853314.35585: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853314.35591: variable 'omit' from source: magic vars 13273 1726853314.35639: variable 'omit' from source: magic vars 13273 1726853314.35666: variable 'omit' from source: magic vars 13273 1726853314.35697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853314.35723: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853314.35738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853314.35753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853314.35767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853314.35788: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853314.35791: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.35793: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13273 1726853314.35859: Set connection var ansible_connection to ssh 13273 1726853314.35866: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853314.35875: Set connection var ansible_shell_executable to /bin/sh 13273 1726853314.35878: Set connection var ansible_shell_type to sh 13273 1726853314.35880: Set connection var ansible_pipelining to False 13273 1726853314.35887: Set connection var ansible_timeout to 10 13273 1726853314.35906: variable 'ansible_shell_executable' from source: unknown 13273 1726853314.35908: variable 'ansible_connection' from source: unknown 13273 1726853314.35911: variable 'ansible_module_compression' from source: unknown 13273 1726853314.35914: variable 'ansible_shell_type' from source: unknown 13273 1726853314.35916: variable 'ansible_shell_executable' from source: unknown 13273 1726853314.35918: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853314.35922: variable 'ansible_pipelining' from source: unknown 13273 1726853314.35924: variable 'ansible_timeout' from source: unknown 13273 1726853314.35928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853314.36065: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853314.36077: variable 'omit' from source: magic vars 13273 1726853314.36080: starting attempt loop 13273 1726853314.36082: running the handler 13273 1726853314.36097: _low_level_execute_command(): starting 13273 1726853314.36102: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853314.36622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13273 1726853314.36626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.36630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853314.36633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.36670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853314.36681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853314.36693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853314.36768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853314.38481: stdout chunk (state=3): >>>/root <<< 13273 1726853314.38677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853314.38681: stdout chunk (state=3): >>><<< 13273 1726853314.38684: stderr chunk (state=3): >>><<< 13273 1726853314.38687: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853314.38690: _low_level_execute_command(): starting 13273 1726853314.38692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583 `" && echo ansible-tmp-1726853314.3865297-14789-15881090756583="` echo /root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583 `" ) && sleep 0' 13273 1726853314.39389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.39445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853314.39462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853314.39488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853314.39576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853314.41540: stdout chunk (state=3): >>>ansible-tmp-1726853314.3865297-14789-15881090756583=/root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583 <<< 13273 1726853314.41700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853314.41703: stdout chunk (state=3): >>><<< 13273 1726853314.41705: stderr chunk (state=3): >>><<< 13273 1726853314.41719: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853314.3865297-14789-15881090756583=/root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853314.41780: variable 'ansible_module_compression' from source: unknown 13273 1726853314.41835: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13273 1726853314.41873: variable 'ansible_facts' from source: unknown 13273 1726853314.41934: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/AnsiballZ_service_facts.py 13273 1726853314.42029: Sending initial data 13273 1726853314.42033: Sent initial data (161 bytes) 13273 1726853314.42456: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853314.42460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.42462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853314.42464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853314.42467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.42507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853314.42518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853314.42589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853314.44165: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853314.44170: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853314.44216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853314.44274: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpoy9d46rp /root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/AnsiballZ_service_facts.py <<< 13273 1726853314.44282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/AnsiballZ_service_facts.py" <<< 13273 1726853314.44330: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpoy9d46rp" to remote "/root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/AnsiballZ_service_facts.py" <<< 13273 1726853314.44955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853314.44984: stderr chunk (state=3): >>><<< 13273 1726853314.44987: stdout chunk (state=3): >>><<< 13273 1726853314.45007: done transferring module to remote 13273 1726853314.45015: _low_level_execute_command(): starting 13273 1726853314.45019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/ /root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/AnsiballZ_service_facts.py && sleep 0' 13273 1726853314.45429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853314.45432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853314.45434: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.45437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853314.45439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.45491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853314.45500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853314.45552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853314.47375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853314.47395: stderr chunk (state=3): >>><<< 13273 1726853314.47398: stdout chunk (state=3): >>><<< 13273 1726853314.47409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853314.47412: _low_level_execute_command(): starting 13273 1726853314.47417: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/AnsiballZ_service_facts.py && sleep 0' 13273 1726853314.47822: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853314.47825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.47827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853314.47829: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853314.47831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853314.47883: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853314.47893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853314.47950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853316.09944: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 13273 1726853316.09966: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": 
"network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 13273 1726853316.09985: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 13273 1726853316.09990: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 13273 1726853316.10011: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13273 1726853316.11627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853316.11630: stdout chunk (state=3): >>><<< 13273 1726853316.11633: stderr chunk (state=3): >>><<< 13273 1726853316.11658: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": 
"systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": 
"autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853316.12477: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853316.12481: _low_level_execute_command(): starting 13273 1726853316.12483: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853314.3865297-14789-15881090756583/ > /dev/null 2>&1 && sleep 0' 13273 1726853316.13040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853316.13052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853316.13067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.13121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.13186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853316.13203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853316.13226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853316.13318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853316.15283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853316.15288: stdout chunk (state=3): >>><<< 13273 1726853316.15476: stderr chunk (state=3): >>><<< 13273 1726853316.15480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 13273 1726853316.15484: handler run complete 13273 1726853316.15573: variable 'ansible_facts' from source: unknown 13273 1726853316.15737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853316.16330: variable 'ansible_facts' from source: unknown 13273 1726853316.16456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853316.16573: attempt loop complete, returning result 13273 1726853316.16578: _execute() done 13273 1726853316.16581: dumping result to json 13273 1726853316.16627: done dumping result, returning 13273 1726853316.16635: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-5fc3-657d-00000000079b] 13273 1726853316.16638: sending task result for task 02083763-bbaf-5fc3-657d-00000000079b ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853316.17236: no more pending results, returning what we have 13273 1726853316.17238: results queue empty 13273 1726853316.17239: checking for any_errors_fatal 13273 1726853316.17243: done checking for any_errors_fatal 13273 1726853316.17243: checking for max_fail_percentage 13273 1726853316.17245: done checking for max_fail_percentage 13273 1726853316.17246: checking to see if all hosts have failed and the running result is not ok 13273 1726853316.17246: done checking to see if all hosts have failed 13273 1726853316.17247: getting the remaining hosts for this loop 13273 1726853316.17248: done getting the remaining hosts for this loop 13273 1726853316.17251: getting the next task for host managed_node3 13273 1726853316.17256: done getting next task for host managed_node3 13273 1726853316.17259: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 13273 1726853316.17263: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853316.17275: getting variables 13273 1726853316.17276: in VariableManager get_vars() 13273 1726853316.17312: Calling all_inventory to load vars for managed_node3 13273 1726853316.17314: Calling groups_inventory to load vars for managed_node3 13273 1726853316.17316: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853316.17323: Calling all_plugins_play to load vars for managed_node3 13273 1726853316.17325: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853316.17328: Calling groups_plugins_play to load vars for managed_node3 13273 1726853316.17844: done sending task result for task 02083763-bbaf-5fc3-657d-00000000079b 13273 1726853316.17849: WORKER PROCESS EXITING 13273 1726853316.18113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853316.23364: done with get_vars() 13273 1726853316.23384: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:28:36 -0400 (0:00:01.887) 0:00:34.123 ****** 13273 1726853316.23448: entering _queue_task() for managed_node3/package_facts 13273 1726853316.23713: worker is 1 (out of 1 available) 13273 1726853316.23726: exiting _queue_task() for managed_node3/package_facts 13273 1726853316.23737: done queuing things up, now waiting for results queue to drain 13273 1726853316.23738: waiting for pending results... 13273 1726853316.23930: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13273 1726853316.24030: in run() - task 02083763-bbaf-5fc3-657d-00000000079c 13273 1726853316.24041: variable 'ansible_search_path' from source: unknown 13273 1726853316.24048: variable 'ansible_search_path' from source: unknown 13273 1726853316.24076: calling self._execute() 13273 1726853316.24155: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853316.24161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853316.24172: variable 'omit' from source: magic vars 13273 1726853316.24490: variable 'ansible_distribution_major_version' from source: facts 13273 1726853316.24503: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853316.24506: variable 'omit' from source: magic vars 13273 1726853316.24578: variable 'omit' from source: magic vars 13273 1726853316.24603: variable 'omit' from source: magic vars 13273 1726853316.24649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853316.24684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853316.24702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853316.24720: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853316.24740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853316.24766: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853316.24770: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853316.24774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853316.24880: Set connection var ansible_connection to ssh 13273 1726853316.24883: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853316.24886: Set connection var ansible_shell_executable to /bin/sh 13273 1726853316.24888: Set connection var ansible_shell_type to sh 13273 1726853316.24891: Set connection var ansible_pipelining to False 13273 1726853316.24893: Set connection var ansible_timeout to 10 13273 1726853316.25076: variable 'ansible_shell_executable' from source: unknown 13273 1726853316.25084: variable 'ansible_connection' from source: unknown 13273 1726853316.25087: variable 'ansible_module_compression' from source: unknown 13273 1726853316.25089: variable 'ansible_shell_type' from source: unknown 13273 1726853316.25092: variable 'ansible_shell_executable' from source: unknown 13273 1726853316.25094: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853316.25096: variable 'ansible_pipelining' from source: unknown 13273 1726853316.25099: variable 'ansible_timeout' from source: unknown 13273 1726853316.25101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853316.25104: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853316.25114: variable 'omit' from source: magic vars 13273 1726853316.25117: starting attempt loop 13273 1726853316.25119: running the handler 13273 1726853316.25133: _low_level_execute_command(): starting 13273 1726853316.25141: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853316.25822: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853316.25839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853316.25937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853316.27656: stdout chunk (state=3): >>>/root <<< 13273 1726853316.27765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853316.27791: stderr chunk (state=3): >>><<< 13273 1726853316.27794: stdout chunk 
(state=3): >>><<< 13273 1726853316.27813: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853316.27826: _low_level_execute_command(): starting 13273 1726853316.27832: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790 `" && echo ansible-tmp-1726853316.278139-14863-272387306133790="` echo /root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790 `" ) && sleep 0' 13273 1726853316.28251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.28255: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853316.28258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853316.28261: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.28308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853316.28311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853316.28397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853316.30360: stdout chunk (state=3): >>>ansible-tmp-1726853316.278139-14863-272387306133790=/root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790 <<< 13273 1726853316.30472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853316.30493: stderr chunk (state=3): >>><<< 13273 1726853316.30497: stdout chunk (state=3): >>><<< 13273 1726853316.30510: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853316.278139-14863-272387306133790=/root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853316.30549: variable 'ansible_module_compression' from source: unknown 13273 1726853316.30588: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13273 1726853316.30640: variable 'ansible_facts' from source: unknown 13273 1726853316.30760: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/AnsiballZ_package_facts.py 13273 1726853316.30857: Sending initial data 13273 1726853316.30861: Sent initial data (161 bytes) 13273 1726853316.31287: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.31290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 
1726853316.31293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.31295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.31298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.31339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853316.31352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853316.31420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853316.33059: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853316.33065: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
13273 1726853316.33114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853316.33172: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp6b9zbssr /root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/AnsiballZ_package_facts.py <<< 13273 1726853316.33179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/AnsiballZ_package_facts.py" <<< 13273 1726853316.33230: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp6b9zbssr" to remote "/root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/AnsiballZ_package_facts.py" <<< 13273 1726853316.33233: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/AnsiballZ_package_facts.py" <<< 13273 1726853316.34518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853316.34521: stderr chunk (state=3): >>><<< 13273 1726853316.34524: stdout chunk (state=3): >>><<< 13273 1726853316.34529: done transferring module to remote 13273 1726853316.34539: _low_level_execute_command(): starting 13273 1726853316.34550: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/ /root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/AnsiballZ_package_facts.py && sleep 0' 13273 1726853316.35157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853316.35161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853316.35163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.35185: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853316.35191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853316.35223: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853316.35235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853316.35288: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.35316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853316.35338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853316.35353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853316.35463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853316.37589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853316.37593: stdout chunk (state=3): >>><<< 13273 1726853316.37595: stderr chunk (state=3): >>><<< 13273 1726853316.37598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853316.37600: _low_level_execute_command(): starting 13273 1726853316.37603: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/AnsiballZ_package_facts.py && sleep 0' 13273 1726853316.38042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853316.38051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853316.38055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.38069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853316.38084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853316.38091: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853316.38100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.38118: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 13273 1726853316.38126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853316.38133: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853316.38153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853316.38162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.38164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853316.38179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853316.38182: stderr chunk (state=3): >>>debug2: match found <<< 13273 1726853316.38197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.38262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853316.38273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853316.38292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853316.38388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853316.83323: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 13273 1726853316.83338: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": 
"21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 13273 1726853316.83350: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", 
"version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 13273 1726853316.83385: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": 
[{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", 
"version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": 
[{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 13273 1726853316.83442: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", 
"epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": 
[{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 13273 1726853316.83450: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", 
"release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": 
[{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", 
"version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 13273 1726853316.83479: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": 
"2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": 
[{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 13273 1726853316.83490: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": 
"2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 13273 1726853316.83516: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": 
"dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13273 1726853316.85353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853316.85358: stderr chunk (state=3): >>><<< 13273 1726853316.85361: stdout chunk (state=3): >>><<< 13273 1726853316.85407: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": 
"dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": 
"libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": 
"22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 
closed. 13273 1726853316.87049: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853316.87066: _low_level_execute_command(): starting 13273 1726853316.87070: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853316.278139-14863-272387306133790/ > /dev/null 2>&1 && sleep 0' 13273 1726853316.87506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.87512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853316.87515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.87517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853316.87519: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853316.87568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853316.87577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853316.87643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853316.89527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853316.89551: stderr chunk (state=3): >>><<< 13273 1726853316.89554: stdout chunk (state=3): >>><<< 13273 1726853316.89564: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 13273 1726853316.89570: handler run complete 13273 1726853316.90114: variable 'ansible_facts' from source: unknown 13273 1726853316.90374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853316.91459: variable 'ansible_facts' from source: unknown 13273 1726853316.91696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853316.92075: attempt loop complete, returning result 13273 1726853316.92084: _execute() done 13273 1726853316.92087: dumping result to json 13273 1726853316.92202: done dumping result, returning 13273 1726853316.92211: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-5fc3-657d-00000000079c] 13273 1726853316.92214: sending task result for task 02083763-bbaf-5fc3-657d-00000000079c 13273 1726853316.93984: done sending task result for task 02083763-bbaf-5fc3-657d-00000000079c 13273 1726853316.93987: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853316.94077: no more pending results, returning what we have 13273 1726853316.94079: results queue empty 13273 1726853316.94079: checking for any_errors_fatal 13273 1726853316.94083: done checking for any_errors_fatal 13273 1726853316.94083: checking for max_fail_percentage 13273 1726853316.94084: done checking for max_fail_percentage 13273 1726853316.94084: checking to see if all hosts have failed and the running result is not ok 13273 1726853316.94085: done checking to see if all hosts have failed 13273 1726853316.94085: getting the remaining hosts for this loop 13273 1726853316.94086: done getting the remaining hosts for this loop 13273 1726853316.94089: getting the next task for host 
managed_node3 13273 1726853316.94093: done getting next task for host managed_node3 13273 1726853316.94095: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853316.94097: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853316.94105: getting variables 13273 1726853316.94106: in VariableManager get_vars() 13273 1726853316.94137: Calling all_inventory to load vars for managed_node3 13273 1726853316.94138: Calling groups_inventory to load vars for managed_node3 13273 1726853316.94140: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853316.94148: Calling all_plugins_play to load vars for managed_node3 13273 1726853316.94150: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853316.94152: Calling groups_plugins_play to load vars for managed_node3 13273 1726853316.94880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853316.96122: done with get_vars() 13273 1726853316.96142: done getting variables 13273 1726853316.96201: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] 
************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:28:36 -0400 (0:00:00.727) 0:00:34.851 ****** 13273 1726853316.96236: entering _queue_task() for managed_node3/debug 13273 1726853316.96535: worker is 1 (out of 1 available) 13273 1726853316.96550: exiting _queue_task() for managed_node3/debug 13273 1726853316.96563: done queuing things up, now waiting for results queue to drain 13273 1726853316.96564: waiting for pending results... 13273 1726853316.96987: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853316.96992: in run() - task 02083763-bbaf-5fc3-657d-0000000000d0 13273 1726853316.97005: variable 'ansible_search_path' from source: unknown 13273 1726853316.97011: variable 'ansible_search_path' from source: unknown 13273 1726853316.97056: calling self._execute() 13273 1726853316.97164: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853316.97179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853316.97195: variable 'omit' from source: magic vars 13273 1726853316.97588: variable 'ansible_distribution_major_version' from source: facts 13273 1726853316.97604: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853316.97614: variable 'omit' from source: magic vars 13273 1726853316.97682: variable 'omit' from source: magic vars 13273 1726853316.97785: variable 'network_provider' from source: set_fact 13273 1726853316.97806: variable 'omit' from source: magic vars 13273 1726853316.97851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853316.97897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853316.97918: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853316.97938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853316.97976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853316.97993: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853316.98000: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853316.98007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853316.98175: Set connection var ansible_connection to ssh 13273 1726853316.98178: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853316.98181: Set connection var ansible_shell_executable to /bin/sh 13273 1726853316.98183: Set connection var ansible_shell_type to sh 13273 1726853316.98185: Set connection var ansible_pipelining to False 13273 1726853316.98193: Set connection var ansible_timeout to 10 13273 1726853316.98195: variable 'ansible_shell_executable' from source: unknown 13273 1726853316.98197: variable 'ansible_connection' from source: unknown 13273 1726853316.98199: variable 'ansible_module_compression' from source: unknown 13273 1726853316.98201: variable 'ansible_shell_type' from source: unknown 13273 1726853316.98203: variable 'ansible_shell_executable' from source: unknown 13273 1726853316.98204: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853316.98212: variable 'ansible_pipelining' from source: unknown 13273 1726853316.98218: variable 'ansible_timeout' from source: unknown 13273 1726853316.98226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853316.98367: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853316.98384: variable 'omit' from source: magic vars 13273 1726853316.98394: starting attempt loop 13273 1726853316.98401: running the handler 13273 1726853316.98453: handler run complete 13273 1726853316.98472: attempt loop complete, returning result 13273 1726853316.98518: _execute() done 13273 1726853316.98521: dumping result to json 13273 1726853316.98523: done dumping result, returning 13273 1726853316.98525: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-5fc3-657d-0000000000d0] 13273 1726853316.98528: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d0 ok: [managed_node3] => {} MSG: Using network provider: nm 13273 1726853316.98692: no more pending results, returning what we have 13273 1726853316.98696: results queue empty 13273 1726853316.98697: checking for any_errors_fatal 13273 1726853316.98707: done checking for any_errors_fatal 13273 1726853316.98707: checking for max_fail_percentage 13273 1726853316.98709: done checking for max_fail_percentage 13273 1726853316.98710: checking to see if all hosts have failed and the running result is not ok 13273 1726853316.98711: done checking to see if all hosts have failed 13273 1726853316.98711: getting the remaining hosts for this loop 13273 1726853316.98712: done getting the remaining hosts for this loop 13273 1726853316.98716: getting the next task for host managed_node3 13273 1726853316.98724: done getting next task for host managed_node3 13273 1726853316.98727: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853316.98730: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853316.98743: getting variables 13273 1726853316.98747: in VariableManager get_vars() 13273 1726853316.98802: Calling all_inventory to load vars for managed_node3 13273 1726853316.98805: Calling groups_inventory to load vars for managed_node3 13273 1726853316.98808: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853316.98818: Calling all_plugins_play to load vars for managed_node3 13273 1726853316.98822: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853316.98825: Calling groups_plugins_play to load vars for managed_node3 13273 1726853316.99485: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d0 13273 1726853316.99489: WORKER PROCESS EXITING 13273 1726853317.00458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.02054: done with get_vars() 13273 1726853317.02085: done getting variables 13273 1726853317.02151: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:28:37 -0400 (0:00:00.059) 0:00:34.911 ****** 13273 1726853317.02189: entering _queue_task() for managed_node3/fail 13273 1726853317.02561: worker is 1 (out of 1 available) 13273 1726853317.02629: exiting _queue_task() for managed_node3/fail 13273 1726853317.02650: done queuing things up, now waiting for results queue to drain 13273 1726853317.02652: waiting for pending results... 13273 1726853317.02880: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853317.03017: in run() - task 02083763-bbaf-5fc3-657d-0000000000d1 13273 1726853317.03038: variable 'ansible_search_path' from source: unknown 13273 1726853317.03048: variable 'ansible_search_path' from source: unknown 13273 1726853317.03091: calling self._execute() 13273 1726853317.03196: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.03208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.03222: variable 'omit' from source: magic vars 13273 1726853317.03617: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.03623: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.03709: variable 'network_state' from source: role '' defaults 13273 1726853317.03717: Evaluated conditional (network_state != {}): False 13273 1726853317.03725: when evaluation is False, skipping this task 13273 1726853317.03728: _execute() done 13273 1726853317.03731: dumping result to json 13273 1726853317.03734: done dumping result, returning 13273 1726853317.03736: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [02083763-bbaf-5fc3-657d-0000000000d1] 13273 1726853317.03739: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d1 13273 1726853317.03832: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d1 13273 1726853317.03835: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853317.03883: no more pending results, returning what we have 13273 1726853317.03887: results queue empty 13273 1726853317.03888: checking for any_errors_fatal 13273 1726853317.03895: done checking for any_errors_fatal 13273 1726853317.03895: checking for max_fail_percentage 13273 1726853317.03897: done checking for max_fail_percentage 13273 1726853317.03898: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.03898: done checking to see if all hosts have failed 13273 1726853317.03899: getting the remaining hosts for this loop 13273 1726853317.03900: done getting the remaining hosts for this loop 13273 1726853317.03903: getting the next task for host managed_node3 13273 1726853317.03910: done getting next task for host managed_node3 13273 1726853317.03913: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853317.03916: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 13273 1726853317.03937: getting variables 13273 1726853317.03939: in VariableManager get_vars() 13273 1726853317.03987: Calling all_inventory to load vars for managed_node3 13273 1726853317.03990: Calling groups_inventory to load vars for managed_node3 13273 1726853317.03992: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.04000: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.04003: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.04005: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.04888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.06040: done with get_vars() 13273 1726853317.06060: done getting variables 13273 1726853317.06118: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:28:37 -0400 (0:00:00.039) 0:00:34.950 ****** 13273 1726853317.06150: entering _queue_task() for managed_node3/fail 13273 1726853317.06470: worker is 1 (out of 1 available) 13273 1726853317.06485: exiting _queue_task() for managed_node3/fail 13273 1726853317.06497: done queuing things up, now waiting for results queue to drain 13273 1726853317.06498: waiting for pending results... 
13273 1726853317.06700: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853317.06801: in run() - task 02083763-bbaf-5fc3-657d-0000000000d2 13273 1726853317.06812: variable 'ansible_search_path' from source: unknown 13273 1726853317.06816: variable 'ansible_search_path' from source: unknown 13273 1726853317.06847: calling self._execute() 13273 1726853317.06929: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.06933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.06943: variable 'omit' from source: magic vars 13273 1726853317.07234: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.07242: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.07329: variable 'network_state' from source: role '' defaults 13273 1726853317.07337: Evaluated conditional (network_state != {}): False 13273 1726853317.07340: when evaluation is False, skipping this task 13273 1726853317.07343: _execute() done 13273 1726853317.07348: dumping result to json 13273 1726853317.07350: done dumping result, returning 13273 1726853317.07355: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-5fc3-657d-0000000000d2] 13273 1726853317.07361: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d2 13273 1726853317.07447: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d2 13273 1726853317.07450: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853317.07524: no more pending results, returning what we have 13273 
1726853317.07527: results queue empty 13273 1726853317.07528: checking for any_errors_fatal 13273 1726853317.07534: done checking for any_errors_fatal 13273 1726853317.07535: checking for max_fail_percentage 13273 1726853317.07536: done checking for max_fail_percentage 13273 1726853317.07537: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.07538: done checking to see if all hosts have failed 13273 1726853317.07538: getting the remaining hosts for this loop 13273 1726853317.07540: done getting the remaining hosts for this loop 13273 1726853317.07543: getting the next task for host managed_node3 13273 1726853317.07551: done getting next task for host managed_node3 13273 1726853317.07554: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853317.07556: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853317.07577: getting variables 13273 1726853317.07578: in VariableManager get_vars() 13273 1726853317.07619: Calling all_inventory to load vars for managed_node3 13273 1726853317.07621: Calling groups_inventory to load vars for managed_node3 13273 1726853317.07623: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.07631: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.07633: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.07636: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.08366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.09788: done with get_vars() 13273 1726853317.09810: done getting variables 13273 1726853317.09883: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:28:37 -0400 (0:00:00.037) 0:00:34.988 ****** 13273 1726853317.09921: entering _queue_task() for managed_node3/fail 13273 1726853317.10163: worker is 1 (out of 1 available) 13273 1726853317.10180: exiting _queue_task() for managed_node3/fail 13273 1726853317.10193: done queuing things up, now waiting for results queue to drain 13273 1726853317.10194: waiting for pending results... 
13273 1726853317.10394: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853317.10472: in run() - task 02083763-bbaf-5fc3-657d-0000000000d3 13273 1726853317.10485: variable 'ansible_search_path' from source: unknown 13273 1726853317.10489: variable 'ansible_search_path' from source: unknown 13273 1726853317.10516: calling self._execute() 13273 1726853317.10595: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.10598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.10607: variable 'omit' from source: magic vars 13273 1726853317.10889: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.10898: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.11013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853317.13277: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853317.13281: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853317.13283: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853317.13285: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853317.13308: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853317.13392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.13427: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.13459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.13505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.13525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.13623: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.13651: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13273 1726853317.13785: variable 'ansible_distribution' from source: facts 13273 1726853317.13796: variable '__network_rh_distros' from source: role '' defaults 13273 1726853317.13811: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13273 1726853317.14021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.14039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.14062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 
1726853317.14098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.14109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.14141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.14159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.14179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.14203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.14213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.14243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.14260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13273 1726853317.14278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.14305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.14315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.14510: variable 'network_connections' from source: task vars 13273 1726853317.14520: variable 'controller_profile' from source: play vars 13273 1726853317.14565: variable 'controller_profile' from source: play vars 13273 1726853317.14574: variable 'controller_device' from source: play vars 13273 1726853317.14618: variable 'controller_device' from source: play vars 13273 1726853317.14627: variable 'port1_profile' from source: play vars 13273 1726853317.14668: variable 'port1_profile' from source: play vars 13273 1726853317.14676: variable 'dhcp_interface1' from source: play vars 13273 1726853317.14721: variable 'dhcp_interface1' from source: play vars 13273 1726853317.14724: variable 'controller_profile' from source: play vars 13273 1726853317.14764: variable 'controller_profile' from source: play vars 13273 1726853317.14770: variable 'port2_profile' from source: play vars 13273 1726853317.14812: variable 'port2_profile' from source: play vars 13273 1726853317.14817: variable 'dhcp_interface2' from source: play vars 13273 1726853317.14862: variable 'dhcp_interface2' from source: play vars 13273 1726853317.14868: variable 'controller_profile' from source: play vars 13273 1726853317.14912: variable 'controller_profile' from source: play vars 13273 1726853317.14922: 
variable 'network_state' from source: role '' defaults 13273 1726853317.14967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853317.15082: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853317.15108: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853317.15130: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853317.15155: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853317.15187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853317.15202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853317.15219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.15236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853317.15270: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13273 1726853317.15275: when evaluation is False, skipping this task 13273 1726853317.15277: _execute() done 13273 1726853317.15279: dumping result to 
json 13273 1726853317.15281: done dumping result, returning 13273 1726853317.15288: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-5fc3-657d-0000000000d3] 13273 1726853317.15292: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d3 13273 1726853317.15377: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d3 13273 1726853317.15379: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13273 1726853317.15429: no more pending results, returning what we have 13273 1726853317.15432: results queue empty 13273 1726853317.15433: checking for any_errors_fatal 13273 1726853317.15440: done checking for any_errors_fatal 13273 1726853317.15441: checking for max_fail_percentage 13273 1726853317.15443: done checking for max_fail_percentage 13273 1726853317.15444: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.15446: done checking to see if all hosts have failed 13273 1726853317.15447: getting the remaining hosts for this loop 13273 1726853317.15449: done getting the remaining hosts for this loop 13273 1726853317.15452: getting the next task for host managed_node3 13273 1726853317.15458: done getting next task for host managed_node3 13273 1726853317.15462: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853317.15464: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853317.15485: getting variables 13273 1726853317.15486: in VariableManager get_vars() 13273 1726853317.15541: Calling all_inventory to load vars for managed_node3 13273 1726853317.15544: Calling groups_inventory to load vars for managed_node3 13273 1726853317.15549: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.15557: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.15560: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.15562: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.16848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.17717: done with get_vars() 13273 1726853317.17733: done getting variables 13273 1726853317.17779: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:28:37 -0400 (0:00:00.078) 0:00:35.067 ****** 
13273 1726853317.17805: entering _queue_task() for managed_node3/dnf 13273 1726853317.18055: worker is 1 (out of 1 available) 13273 1726853317.18069: exiting _queue_task() for managed_node3/dnf 13273 1726853317.18082: done queuing things up, now waiting for results queue to drain 13273 1726853317.18083: waiting for pending results... 13273 1726853317.18263: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853317.18348: in run() - task 02083763-bbaf-5fc3-657d-0000000000d4 13273 1726853317.18357: variable 'ansible_search_path' from source: unknown 13273 1726853317.18360: variable 'ansible_search_path' from source: unknown 13273 1726853317.18392: calling self._execute() 13273 1726853317.18472: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.18476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.18485: variable 'omit' from source: magic vars 13273 1726853317.18757: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.18766: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.18900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853317.20388: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853317.20429: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853317.20456: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853317.20486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853317.20506: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853317.20562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.20596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.20614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.20640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.20652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.20733: variable 'ansible_distribution' from source: facts 13273 1726853317.20736: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.20750: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13273 1726853317.20825: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853317.20909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.20927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.20944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.20969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.20982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.21009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.21031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.21048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.21072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.21090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.21117: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.21137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.21152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.21178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.21188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.21285: variable 'network_connections' from source: task vars 13273 1726853317.21296: variable 'controller_profile' from source: play vars 13273 1726853317.21337: variable 'controller_profile' from source: play vars 13273 1726853317.21348: variable 'controller_device' from source: play vars 13273 1726853317.21391: variable 'controller_device' from source: play vars 13273 1726853317.21399: variable 'port1_profile' from source: play vars 13273 1726853317.21439: variable 'port1_profile' from source: play vars 13273 1726853317.21449: variable 'dhcp_interface1' from source: play vars 13273 1726853317.21491: variable 'dhcp_interface1' from source: play vars 13273 1726853317.21497: variable 'controller_profile' from source: play vars 13273 1726853317.21537: variable 'controller_profile' from source: play vars 13273 1726853317.21543: variable 'port2_profile' from source: play vars 13273 
1726853317.21589: variable 'port2_profile' from source: play vars 13273 1726853317.21595: variable 'dhcp_interface2' from source: play vars 13273 1726853317.21636: variable 'dhcp_interface2' from source: play vars 13273 1726853317.21642: variable 'controller_profile' from source: play vars 13273 1726853317.21687: variable 'controller_profile' from source: play vars 13273 1726853317.21732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853317.21843: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853317.21870: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853317.21897: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853317.21918: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853317.21950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853317.21974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853317.21993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.22014: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853317.22060: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853317.22207: variable 
'network_connections' from source: task vars 13273 1726853317.22210: variable 'controller_profile' from source: play vars 13273 1726853317.22257: variable 'controller_profile' from source: play vars 13273 1726853317.22263: variable 'controller_device' from source: play vars 13273 1726853317.22305: variable 'controller_device' from source: play vars 13273 1726853317.22312: variable 'port1_profile' from source: play vars 13273 1726853317.22355: variable 'port1_profile' from source: play vars 13273 1726853317.22361: variable 'dhcp_interface1' from source: play vars 13273 1726853317.22402: variable 'dhcp_interface1' from source: play vars 13273 1726853317.22408: variable 'controller_profile' from source: play vars 13273 1726853317.22477: variable 'controller_profile' from source: play vars 13273 1726853317.22480: variable 'port2_profile' from source: play vars 13273 1726853317.22496: variable 'port2_profile' from source: play vars 13273 1726853317.22502: variable 'dhcp_interface2' from source: play vars 13273 1726853317.22543: variable 'dhcp_interface2' from source: play vars 13273 1726853317.22550: variable 'controller_profile' from source: play vars 13273 1726853317.22591: variable 'controller_profile' from source: play vars 13273 1726853317.22615: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853317.22618: when evaluation is False, skipping this task 13273 1726853317.22621: _execute() done 13273 1726853317.22624: dumping result to json 13273 1726853317.22626: done dumping result, returning 13273 1726853317.22634: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-0000000000d4] 13273 1726853317.22638: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d4 13273 1726853317.22730: done sending task result for 
task 02083763-bbaf-5fc3-657d-0000000000d4 13273 1726853317.22733: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853317.22823: no more pending results, returning what we have 13273 1726853317.22826: results queue empty 13273 1726853317.22827: checking for any_errors_fatal 13273 1726853317.22833: done checking for any_errors_fatal 13273 1726853317.22834: checking for max_fail_percentage 13273 1726853317.22836: done checking for max_fail_percentage 13273 1726853317.22836: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.22837: done checking to see if all hosts have failed 13273 1726853317.22838: getting the remaining hosts for this loop 13273 1726853317.22839: done getting the remaining hosts for this loop 13273 1726853317.22842: getting the next task for host managed_node3 13273 1726853317.22851: done getting next task for host managed_node3 13273 1726853317.22855: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13273 1726853317.22857: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853317.22880: getting variables 13273 1726853317.22882: in VariableManager get_vars() 13273 1726853317.22926: Calling all_inventory to load vars for managed_node3 13273 1726853317.22929: Calling groups_inventory to load vars for managed_node3 13273 1726853317.22931: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.22938: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.22941: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.22943: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.23727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.24704: done with get_vars() 13273 1726853317.24722: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853317.24780: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:28:37 -0400 (0:00:00.069) 0:00:35.137 ****** 13273 1726853317.24804: entering _queue_task() for managed_node3/yum 13273 1726853317.25064: worker is 1 (out of 1 available) 13273 1726853317.25078: exiting _queue_task() for managed_node3/yum 13273 1726853317.25091: done queuing things up, now waiting for results queue to drain 13273 1726853317.25092: waiting for pending results... 
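The two skips logged above both hinge on the same team-interface detection: the `false_condition` chain `selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` evaluated over `network_connections` and `network_state`. A plain-Python sketch of that Jinja2 filter chain (the sample `connections` data is hypothetical, modeled loosely on the bond controller/port profiles referenced in the log, not copied from it):

```python
import re

def has_team(items):
    """Plain-Python equivalent of the logged Jinja2 chain:
    items | selectattr("type", "defined")
          | selectattr("type", "match", "^team$")
          | list | length > 0
    """
    return len([i for i in items if "type" in i and re.match("^team$", i["type"])]) > 0

def team_configured(network_connections, network_state):
    # The role only aborts (or restarts NetworkManager) when either the
    # requested connections or the desired network_state define a team device.
    return has_team(network_connections) or has_team(network_state.get("interfaces", []))

# Hypothetical bond-based profiles: a controller plus two ports, no team devices.
connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},
    {"name": "bond0.1", "type": "ethernet", "controller": "bond0"},
]
print(team_configured(connections, {}))  # False -> the teaming tasks are skipped
```

With no `type: team` entry present, the condition is False, which matches the `skip_reason: "Conditional result was False"` results above.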
13273 1726853317.25278: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13273 1726853317.25378: in run() - task 02083763-bbaf-5fc3-657d-0000000000d5 13273 1726853317.25391: variable 'ansible_search_path' from source: unknown 13273 1726853317.25395: variable 'ansible_search_path' from source: unknown 13273 1726853317.25427: calling self._execute() 13273 1726853317.25508: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.25512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.25521: variable 'omit' from source: magic vars 13273 1726853317.25808: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.25818: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.25941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853317.27468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853317.27511: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853317.27537: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853317.27564: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853317.27590: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853317.27642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.27675: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.27698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.27722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.27732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.27808: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.27816: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13273 1726853317.27819: when evaluation is False, skipping this task 13273 1726853317.27823: _execute() done 13273 1726853317.27826: dumping result to json 13273 1726853317.27829: done dumping result, returning 13273 1726853317.27837: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-0000000000d5] 13273 1726853317.27839: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d5 13273 1726853317.27928: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d5 13273 1726853317.27931: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13273 1726853317.27986: no more pending results, returning 
what we have 13273 1726853317.27989: results queue empty 13273 1726853317.27990: checking for any_errors_fatal 13273 1726853317.27996: done checking for any_errors_fatal 13273 1726853317.27996: checking for max_fail_percentage 13273 1726853317.27998: done checking for max_fail_percentage 13273 1726853317.27999: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.27999: done checking to see if all hosts have failed 13273 1726853317.28000: getting the remaining hosts for this loop 13273 1726853317.28001: done getting the remaining hosts for this loop 13273 1726853317.28004: getting the next task for host managed_node3 13273 1726853317.28011: done getting next task for host managed_node3 13273 1726853317.28015: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13273 1726853317.28017: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853317.28038: getting variables 13273 1726853317.28039: in VariableManager get_vars() 13273 1726853317.28092: Calling all_inventory to load vars for managed_node3 13273 1726853317.28095: Calling groups_inventory to load vars for managed_node3 13273 1726853317.28097: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.28105: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.28107: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.28109: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.28879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.29740: done with get_vars() 13273 1726853317.29757: done getting variables 13273 1726853317.29800: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:28:37 -0400 (0:00:00.050) 0:00:35.187 ****** 13273 1726853317.29825: entering _queue_task() for managed_node3/fail 13273 1726853317.30054: worker is 1 (out of 1 available) 13273 1726853317.30069: exiting _queue_task() for managed_node3/fail 13273 1726853317.30084: done queuing things up, now waiting for results queue to drain 13273 1726853317.30085: waiting for pending results... 
13273 1726853317.30258: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13273 1726853317.30351: in run() - task 02083763-bbaf-5fc3-657d-0000000000d6 13273 1726853317.30363: variable 'ansible_search_path' from source: unknown 13273 1726853317.30367: variable 'ansible_search_path' from source: unknown 13273 1726853317.30403: calling self._execute() 13273 1726853317.30486: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.30490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.30498: variable 'omit' from source: magic vars 13273 1726853317.30768: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.30780: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.30859: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853317.30990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853317.32452: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853317.32499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853317.32525: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853317.32551: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853317.32568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853317.32629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853317.32895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.32918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.32944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.32956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.32991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.33008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.33029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.33057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.33067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.33097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.33112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.33129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.33161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.33173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.33286: variable 'network_connections' from source: task vars 13273 1726853317.33296: variable 'controller_profile' from source: play vars 13273 1726853317.33344: variable 'controller_profile' from source: play vars 13273 1726853317.33356: variable 'controller_device' from source: play vars 13273 1726853317.33398: variable 'controller_device' from source: play vars 13273 1726853317.33407: variable 'port1_profile' from source: play vars 13273 1726853317.33448: variable 'port1_profile' from source: play vars 13273 1726853317.33461: variable 'dhcp_interface1' from source: play vars 13273 1726853317.33502: variable 'dhcp_interface1' from source: play vars 13273 1726853317.33508: variable 'controller_profile' from source: play vars 13273 
1726853317.33548: variable 'controller_profile' from source: play vars 13273 1726853317.33557: variable 'port2_profile' from source: play vars 13273 1726853317.33602: variable 'port2_profile' from source: play vars 13273 1726853317.33608: variable 'dhcp_interface2' from source: play vars 13273 1726853317.33647: variable 'dhcp_interface2' from source: play vars 13273 1726853317.33656: variable 'controller_profile' from source: play vars 13273 1726853317.33700: variable 'controller_profile' from source: play vars 13273 1726853317.33745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853317.33857: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853317.33884: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853317.33909: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853317.33930: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853317.33963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853317.33979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853317.33998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.34018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13273 1726853317.34070: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853317.34222: variable 'network_connections' from source: task vars 13273 1726853317.34226: variable 'controller_profile' from source: play vars 13273 1726853317.34272: variable 'controller_profile' from source: play vars 13273 1726853317.34278: variable 'controller_device' from source: play vars 13273 1726853317.34318: variable 'controller_device' from source: play vars 13273 1726853317.34326: variable 'port1_profile' from source: play vars 13273 1726853317.34373: variable 'port1_profile' from source: play vars 13273 1726853317.34379: variable 'dhcp_interface1' from source: play vars 13273 1726853317.34428: variable 'dhcp_interface1' from source: play vars 13273 1726853317.34434: variable 'controller_profile' from source: play vars 13273 1726853317.34481: variable 'controller_profile' from source: play vars 13273 1726853317.34487: variable 'port2_profile' from source: play vars 13273 1726853317.34527: variable 'port2_profile' from source: play vars 13273 1726853317.34533: variable 'dhcp_interface2' from source: play vars 13273 1726853317.34580: variable 'dhcp_interface2' from source: play vars 13273 1726853317.34585: variable 'controller_profile' from source: play vars 13273 1726853317.34627: variable 'controller_profile' from source: play vars 13273 1726853317.34652: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853317.34655: when evaluation is False, skipping this task 13273 1726853317.34658: _execute() done 13273 1726853317.34662: dumping result to json 13273 1726853317.34664: done dumping result, returning 13273 1726853317.34676: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-0000000000d6] 13273 1726853317.34678: sending 
task result for task 02083763-bbaf-5fc3-657d-0000000000d6 13273 1726853317.34759: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d6 13273 1726853317.34761: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853317.34820: no more pending results, returning what we have 13273 1726853317.34823: results queue empty 13273 1726853317.34824: checking for any_errors_fatal 13273 1726853317.34830: done checking for any_errors_fatal 13273 1726853317.34831: checking for max_fail_percentage 13273 1726853317.34832: done checking for max_fail_percentage 13273 1726853317.34833: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.34834: done checking to see if all hosts have failed 13273 1726853317.34834: getting the remaining hosts for this loop 13273 1726853317.34836: done getting the remaining hosts for this loop 13273 1726853317.34839: getting the next task for host managed_node3 13273 1726853317.34845: done getting next task for host managed_node3 13273 1726853317.34848: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13273 1726853317.34851: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853317.34873: getting variables 13273 1726853317.34874: in VariableManager get_vars() 13273 1726853317.34924: Calling all_inventory to load vars for managed_node3 13273 1726853317.34927: Calling groups_inventory to load vars for managed_node3 13273 1726853317.34930: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.34938: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.34941: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.34943: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.35860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.36711: done with get_vars() 13273 1726853317.36726: done getting variables 13273 1726853317.36769: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:28:37 -0400 (0:00:00.069) 0:00:35.257 ****** 13273 1726853317.36795: entering _queue_task() for managed_node3/package 13273 1726853317.37037: worker is 1 (out of 1 available) 13273 1726853317.37050: exiting _queue_task() for managed_node3/package 13273 1726853317.37062: done queuing things up, now waiting for results queue to drain 13273 1726853317.37063: waiting for pending results... 
13273 1726853317.37240: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 13273 1726853317.37325: in run() - task 02083763-bbaf-5fc3-657d-0000000000d7 13273 1726853317.37336: variable 'ansible_search_path' from source: unknown 13273 1726853317.37339: variable 'ansible_search_path' from source: unknown 13273 1726853317.37370: calling self._execute() 13273 1726853317.37448: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.37455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.37465: variable 'omit' from source: magic vars 13273 1726853317.37737: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.37749: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.37884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853317.38076: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853317.38109: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853317.38134: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853317.38186: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853317.38260: variable 'network_packages' from source: role '' defaults 13273 1726853317.38332: variable '__network_provider_setup' from source: role '' defaults 13273 1726853317.38340: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853317.38389: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853317.38397: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853317.38439: variable 
'__network_packages_default_nm' from source: role '' defaults 13273 1726853317.38554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853317.39880: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853317.39927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853317.39955: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853317.39980: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853317.40004: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853317.40061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.40082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.40099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.40130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.40140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 
1726853317.40175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.40191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.40207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.40237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.40247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.40393: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853317.40465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.40482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.40498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.40522: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.40532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.40596: variable 'ansible_python' from source: facts 13273 1726853317.40616: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853317.40675: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853317.40727: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853317.40821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.40837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.40856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.40885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.40895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.40926: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.40946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.40964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.40993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.41003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.41099: variable 'network_connections' from source: task vars 13273 1726853317.41104: variable 'controller_profile' from source: play vars 13273 1726853317.41175: variable 'controller_profile' from source: play vars 13273 1726853317.41182: variable 'controller_device' from source: play vars 13273 1726853317.41252: variable 'controller_device' from source: play vars 13273 1726853317.41261: variable 'port1_profile' from source: play vars 13273 1726853317.41331: variable 'port1_profile' from source: play vars 13273 1726853317.41339: variable 'dhcp_interface1' from source: play vars 13273 1726853317.41410: variable 'dhcp_interface1' from source: play vars 13273 1726853317.41415: variable 'controller_profile' from source: play vars 13273 1726853317.41486: variable 'controller_profile' from source: play vars 13273 1726853317.41493: variable 'port2_profile' from source: play vars 13273 
1726853317.41563: variable 'port2_profile' from source: play vars 13273 1726853317.41572: variable 'dhcp_interface2' from source: play vars 13273 1726853317.41642: variable 'dhcp_interface2' from source: play vars 13273 1726853317.41645: variable 'controller_profile' from source: play vars 13273 1726853317.41713: variable 'controller_profile' from source: play vars 13273 1726853317.41766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853317.41787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853317.41806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.41827: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853317.41869: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853317.42045: variable 'network_connections' from source: task vars 13273 1726853317.42052: variable 'controller_profile' from source: play vars 13273 1726853317.42123: variable 'controller_profile' from source: play vars 13273 1726853317.42130: variable 'controller_device' from source: play vars 13273 1726853317.42202: variable 'controller_device' from source: play vars 13273 1726853317.42211: variable 'port1_profile' from source: play vars 13273 1726853317.42281: variable 'port1_profile' from source: play vars 13273 1726853317.42289: variable 'dhcp_interface1' from source: play vars 13273 1726853317.42356: variable 'dhcp_interface1' from source: 
play vars 13273 1726853317.42363: variable 'controller_profile' from source: play vars 13273 1726853317.42433: variable 'controller_profile' from source: play vars 13273 1726853317.42441: variable 'port2_profile' from source: play vars 13273 1726853317.42513: variable 'port2_profile' from source: play vars 13273 1726853317.42519: variable 'dhcp_interface2' from source: play vars 13273 1726853317.42587: variable 'dhcp_interface2' from source: play vars 13273 1726853317.42594: variable 'controller_profile' from source: play vars 13273 1726853317.42667: variable 'controller_profile' from source: play vars 13273 1726853317.42706: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853317.42763: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853317.42961: variable 'network_connections' from source: task vars 13273 1726853317.42964: variable 'controller_profile' from source: play vars 13273 1726853317.43012: variable 'controller_profile' from source: play vars 13273 1726853317.43018: variable 'controller_device' from source: play vars 13273 1726853317.43068: variable 'controller_device' from source: play vars 13273 1726853317.43076: variable 'port1_profile' from source: play vars 13273 1726853317.43120: variable 'port1_profile' from source: play vars 13273 1726853317.43125: variable 'dhcp_interface1' from source: play vars 13273 1726853317.43175: variable 'dhcp_interface1' from source: play vars 13273 1726853317.43181: variable 'controller_profile' from source: play vars 13273 1726853317.43224: variable 'controller_profile' from source: play vars 13273 1726853317.43229: variable 'port2_profile' from source: play vars 13273 1726853317.43278: variable 'port2_profile' from source: play vars 13273 1726853317.43284: variable 'dhcp_interface2' from source: play vars 13273 1726853317.43327: variable 'dhcp_interface2' from source: play vars 13273 1726853317.43333: variable 'controller_profile' from 
source: play vars 13273 1726853317.43382: variable 'controller_profile' from source: play vars 13273 1726853317.43401: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853317.43455: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853317.43649: variable 'network_connections' from source: task vars 13273 1726853317.43656: variable 'controller_profile' from source: play vars 13273 1726853317.43703: variable 'controller_profile' from source: play vars 13273 1726853317.43710: variable 'controller_device' from source: play vars 13273 1726853317.43755: variable 'controller_device' from source: play vars 13273 1726853317.43762: variable 'port1_profile' from source: play vars 13273 1726853317.43810: variable 'port1_profile' from source: play vars 13273 1726853317.43814: variable 'dhcp_interface1' from source: play vars 13273 1726853317.43861: variable 'dhcp_interface1' from source: play vars 13273 1726853317.43865: variable 'controller_profile' from source: play vars 13273 1726853317.43913: variable 'controller_profile' from source: play vars 13273 1726853317.43920: variable 'port2_profile' from source: play vars 13273 1726853317.43965: variable 'port2_profile' from source: play vars 13273 1726853317.43972: variable 'dhcp_interface2' from source: play vars 13273 1726853317.44016: variable 'dhcp_interface2' from source: play vars 13273 1726853317.44023: variable 'controller_profile' from source: play vars 13273 1726853317.44069: variable 'controller_profile' from source: play vars 13273 1726853317.44116: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853317.44160: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853317.44164: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853317.44207: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 
1726853317.44343: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853317.44679: variable 'network_connections' from source: task vars 13273 1726853317.44684: variable 'controller_profile' from source: play vars 13273 1726853317.44725: variable 'controller_profile' from source: play vars 13273 1726853317.44733: variable 'controller_device' from source: play vars 13273 1726853317.44773: variable 'controller_device' from source: play vars 13273 1726853317.44782: variable 'port1_profile' from source: play vars 13273 1726853317.44823: variable 'port1_profile' from source: play vars 13273 1726853317.44829: variable 'dhcp_interface1' from source: play vars 13273 1726853317.44870: variable 'dhcp_interface1' from source: play vars 13273 1726853317.44875: variable 'controller_profile' from source: play vars 13273 1726853317.44918: variable 'controller_profile' from source: play vars 13273 1726853317.44927: variable 'port2_profile' from source: play vars 13273 1726853317.44967: variable 'port2_profile' from source: play vars 13273 1726853317.44974: variable 'dhcp_interface2' from source: play vars 13273 1726853317.45077: variable 'dhcp_interface2' from source: play vars 13273 1726853317.45080: variable 'controller_profile' from source: play vars 13273 1726853317.45083: variable 'controller_profile' from source: play vars 13273 1726853317.45085: variable 'ansible_distribution' from source: facts 13273 1726853317.45087: variable '__network_rh_distros' from source: role '' defaults 13273 1726853317.45089: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.45091: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853317.45198: variable 'ansible_distribution' from source: facts 13273 1726853317.45201: variable '__network_rh_distros' from source: role '' defaults 13273 1726853317.45205: variable 'ansible_distribution_major_version' from source: 
facts 13273 1726853317.45218: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853317.45325: variable 'ansible_distribution' from source: facts 13273 1726853317.45328: variable '__network_rh_distros' from source: role '' defaults 13273 1726853317.45330: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.45356: variable 'network_provider' from source: set_fact 13273 1726853317.45367: variable 'ansible_facts' from source: unknown 13273 1726853317.45800: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13273 1726853317.45803: when evaluation is False, skipping this task 13273 1726853317.45806: _execute() done 13273 1726853317.45808: dumping result to json 13273 1726853317.45810: done dumping result, returning 13273 1726853317.45818: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-5fc3-657d-0000000000d7] 13273 1726853317.45823: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d7 13273 1726853317.45913: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d7 13273 1726853317.45916: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13273 1726853317.45981: no more pending results, returning what we have 13273 1726853317.45984: results queue empty 13273 1726853317.45985: checking for any_errors_fatal 13273 1726853317.45992: done checking for any_errors_fatal 13273 1726853317.45993: checking for max_fail_percentage 13273 1726853317.45995: done checking for max_fail_percentage 13273 1726853317.45995: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.45996: done checking to see if all hosts have failed 13273 1726853317.45997: getting the remaining hosts for 
this loop 13273 1726853317.45998: done getting the remaining hosts for this loop 13273 1726853317.46002: getting the next task for host managed_node3 13273 1726853317.46009: done getting next task for host managed_node3 13273 1726853317.46012: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853317.46015: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853317.46151: getting variables 13273 1726853317.46153: in VariableManager get_vars() 13273 1726853317.46250: Calling all_inventory to load vars for managed_node3 13273 1726853317.46253: Calling groups_inventory to load vars for managed_node3 13273 1726853317.46255: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.46264: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.46267: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.46273: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.47443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.48316: done with get_vars() 13273 1726853317.48333: done getting variables 13273 1726853317.48380: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:28:37 -0400 (0:00:00.116) 0:00:35.373 ****** 13273 1726853317.48405: entering _queue_task() for managed_node3/package 13273 1726853317.48651: worker is 1 (out of 1 available) 13273 1726853317.48665: exiting _queue_task() for managed_node3/package 13273 1726853317.48678: done queuing things up, now waiting for results queue to drain 13273 1726853317.48679: waiting for pending results... 13273 1726853317.48868: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853317.48969: in run() - task 02083763-bbaf-5fc3-657d-0000000000d8 13273 1726853317.48983: variable 'ansible_search_path' from source: unknown 13273 1726853317.48987: variable 'ansible_search_path' from source: unknown 13273 1726853317.49018: calling self._execute() 13273 1726853317.49098: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.49104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.49112: variable 'omit' from source: magic vars 13273 1726853317.49687: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.49690: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.49766: variable 'network_state' from source: role '' defaults 13273 1726853317.49784: Evaluated conditional (network_state != {}): False 13273 1726853317.49792: when evaluation is False, skipping this task 13273 1726853317.49799: _execute() done 13273 
1726853317.49812: dumping result to json 13273 1726853317.49821: done dumping result, returning 13273 1726853317.49834: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-5fc3-657d-0000000000d8] 13273 1726853317.49843: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d8 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853317.50081: no more pending results, returning what we have 13273 1726853317.50085: results queue empty 13273 1726853317.50088: checking for any_errors_fatal 13273 1726853317.50093: done checking for any_errors_fatal 13273 1726853317.50093: checking for max_fail_percentage 13273 1726853317.50095: done checking for max_fail_percentage 13273 1726853317.50096: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.50097: done checking to see if all hosts have failed 13273 1726853317.50098: getting the remaining hosts for this loop 13273 1726853317.50100: done getting the remaining hosts for this loop 13273 1726853317.50103: getting the next task for host managed_node3 13273 1726853317.50110: done getting next task for host managed_node3 13273 1726853317.50114: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853317.50117: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13273 1726853317.50379: getting variables 13273 1726853317.50381: in VariableManager get_vars() 13273 1726853317.50431: Calling all_inventory to load vars for managed_node3 13273 1726853317.50434: Calling groups_inventory to load vars for managed_node3 13273 1726853317.50437: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.50450: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.50454: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.50458: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.51064: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d8 13273 1726853317.51067: WORKER PROCESS EXITING 13273 1726853317.52049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.53566: done with get_vars() 13273 1726853317.53601: done getting variables 13273 1726853317.53665: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:28:37 -0400 (0:00:00.052) 0:00:35.426 ****** 13273 1726853317.53710: entering _queue_task() for managed_node3/package 13273 1726853317.54119: worker is 1 (out of 1 available) 13273 1726853317.54246: exiting _queue_task() for managed_node3/package 13273 1726853317.54257: done queuing things up, now waiting for results queue to drain 13273 1726853317.54258: waiting for pending results... 
13273 1726853317.54422: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853317.54584: in run() - task 02083763-bbaf-5fc3-657d-0000000000d9 13273 1726853317.54609: variable 'ansible_search_path' from source: unknown 13273 1726853317.54617: variable 'ansible_search_path' from source: unknown 13273 1726853317.54656: calling self._execute() 13273 1726853317.54767: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.54825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.54839: variable 'omit' from source: magic vars 13273 1726853317.55231: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.55255: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.55389: variable 'network_state' from source: role '' defaults 13273 1726853317.55408: Evaluated conditional (network_state != {}): False 13273 1726853317.55416: when evaluation is False, skipping this task 13273 1726853317.55426: _execute() done 13273 1726853317.55446: dumping result to json 13273 1726853317.55467: done dumping result, returning 13273 1726853317.55882: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-5fc3-657d-0000000000d9] 13273 1726853317.55886: sending task result for task 02083763-bbaf-5fc3-657d-0000000000d9 13273 1726853317.55964: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000d9 13273 1726853317.55969: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853317.56020: no more pending results, returning what we have 13273 1726853317.56024: results queue empty 13273 1726853317.56025: checking for 
any_errors_fatal 13273 1726853317.56033: done checking for any_errors_fatal 13273 1726853317.56034: checking for max_fail_percentage 13273 1726853317.56036: done checking for max_fail_percentage 13273 1726853317.56037: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.56037: done checking to see if all hosts have failed 13273 1726853317.56038: getting the remaining hosts for this loop 13273 1726853317.56040: done getting the remaining hosts for this loop 13273 1726853317.56043: getting the next task for host managed_node3 13273 1726853317.56050: done getting next task for host managed_node3 13273 1726853317.56053: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853317.56057: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853317.56082: getting variables 13273 1726853317.56084: in VariableManager get_vars() 13273 1726853317.56142: Calling all_inventory to load vars for managed_node3 13273 1726853317.56146: Calling groups_inventory to load vars for managed_node3 13273 1726853317.56149: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.56160: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.56163: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.56167: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.58561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.60380: done with get_vars() 13273 1726853317.60403: done getting variables 13273 1726853317.60467: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:28:37 -0400 (0:00:00.067) 0:00:35.494 ****** 13273 1726853317.60505: entering _queue_task() for managed_node3/service 13273 1726853317.60972: worker is 1 (out of 1 available) 13273 1726853317.60981: exiting _queue_task() for managed_node3/service 13273 1726853317.60992: done queuing things up, now waiting for results queue to drain 13273 1726853317.60993: waiting for pending results... 
13273 1726853317.61190: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853317.61387: in run() - task 02083763-bbaf-5fc3-657d-0000000000da 13273 1726853317.61391: variable 'ansible_search_path' from source: unknown 13273 1726853317.61394: variable 'ansible_search_path' from source: unknown 13273 1726853317.61402: calling self._execute() 13273 1726853317.61512: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.61524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.61539: variable 'omit' from source: magic vars 13273 1726853317.62039: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.62042: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.62075: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853317.62282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853317.64547: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853317.64620: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853317.64676: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853317.64716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853317.64751: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853317.64832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13273 1726853317.64889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.64964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.64973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.64993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.65043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.65078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.65107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.65148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.65180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.65215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.65276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.65286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.65324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.65341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.65527: variable 'network_connections' from source: task vars 13273 1726853317.65547: variable 'controller_profile' from source: play vars 13273 1726853317.65625: variable 'controller_profile' from source: play vars 13273 1726853317.65676: variable 'controller_device' from source: play vars 13273 1726853317.65704: variable 'controller_device' from source: play vars 13273 1726853317.65727: variable 'port1_profile' from source: play vars 13273 1726853317.65790: variable 'port1_profile' from source: play vars 13273 1726853317.65802: variable 'dhcp_interface1' from source: play vars 13273 1726853317.65869: variable 'dhcp_interface1' from source: play vars 13273 1726853317.65942: variable 'controller_profile' from source: play vars 13273 
1726853317.65949: variable 'controller_profile' from source: play vars 13273 1726853317.65960: variable 'port2_profile' from source: play vars 13273 1726853317.66022: variable 'port2_profile' from source: play vars 13273 1726853317.66034: variable 'dhcp_interface2' from source: play vars 13273 1726853317.66102: variable 'dhcp_interface2' from source: play vars 13273 1726853317.66114: variable 'controller_profile' from source: play vars 13273 1726853317.66181: variable 'controller_profile' from source: play vars 13273 1726853317.66253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853317.66436: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853317.66484: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853317.66676: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853317.66679: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853317.66681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853317.66684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853317.66686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.66687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13273 1726853317.66745: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853317.67010: variable 'network_connections' from source: task vars 13273 1726853317.67026: variable 'controller_profile' from source: play vars 13273 1726853317.67092: variable 'controller_profile' from source: play vars 13273 1726853317.67104: variable 'controller_device' from source: play vars 13273 1726853317.67174: variable 'controller_device' from source: play vars 13273 1726853317.67249: variable 'port1_profile' from source: play vars 13273 1726853317.67252: variable 'port1_profile' from source: play vars 13273 1726853317.67262: variable 'dhcp_interface1' from source: play vars 13273 1726853317.67322: variable 'dhcp_interface1' from source: play vars 13273 1726853317.67333: variable 'controller_profile' from source: play vars 13273 1726853317.67401: variable 'controller_profile' from source: play vars 13273 1726853317.67412: variable 'port2_profile' from source: play vars 13273 1726853317.67481: variable 'port2_profile' from source: play vars 13273 1726853317.67492: variable 'dhcp_interface2' from source: play vars 13273 1726853317.67551: variable 'dhcp_interface2' from source: play vars 13273 1726853317.67562: variable 'controller_profile' from source: play vars 13273 1726853317.67627: variable 'controller_profile' from source: play vars 13273 1726853317.67662: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853317.67677: when evaluation is False, skipping this task 13273 1726853317.67786: _execute() done 13273 1726853317.67790: dumping result to json 13273 1726853317.67792: done dumping result, returning 13273 1726853317.67794: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-0000000000da] 13273 1726853317.67796: sending task result for task 
02083763-bbaf-5fc3-657d-0000000000da 13273 1726853317.67864: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000da 13273 1726853317.67867: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853317.67937: no more pending results, returning what we have 13273 1726853317.67940: results queue empty 13273 1726853317.67941: checking for any_errors_fatal 13273 1726853317.67951: done checking for any_errors_fatal 13273 1726853317.67952: checking for max_fail_percentage 13273 1726853317.67954: done checking for max_fail_percentage 13273 1726853317.67954: checking to see if all hosts have failed and the running result is not ok 13273 1726853317.67955: done checking to see if all hosts have failed 13273 1726853317.67956: getting the remaining hosts for this loop 13273 1726853317.67958: done getting the remaining hosts for this loop 13273 1726853317.67961: getting the next task for host managed_node3 13273 1726853317.67967: done getting next task for host managed_node3 13273 1726853317.67976: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853317.67980: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853317.68003: getting variables 13273 1726853317.68004: in VariableManager get_vars() 13273 1726853317.68061: Calling all_inventory to load vars for managed_node3 13273 1726853317.68064: Calling groups_inventory to load vars for managed_node3 13273 1726853317.68066: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853317.68181: Calling all_plugins_play to load vars for managed_node3 13273 1726853317.68185: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853317.68189: Calling groups_plugins_play to load vars for managed_node3 13273 1726853317.69730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853317.71361: done with get_vars() 13273 1726853317.71387: done getting variables 13273 1726853317.71452: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:28:37 -0400 (0:00:00.109) 0:00:35.603 ****** 13273 1726853317.71486: entering _queue_task() for managed_node3/service 13273 1726853317.71835: worker is 1 (out of 1 available) 13273 1726853317.71847: exiting _queue_task() for managed_node3/service 13273 1726853317.71975: done queuing things up, now waiting for results queue to drain 13273 1726853317.71977: waiting for pending results... 
13273 1726853317.72294: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853317.72311: in run() - task 02083763-bbaf-5fc3-657d-0000000000db 13273 1726853317.72332: variable 'ansible_search_path' from source: unknown 13273 1726853317.72340: variable 'ansible_search_path' from source: unknown 13273 1726853317.72383: calling self._execute() 13273 1726853317.72491: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.72513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.72526: variable 'omit' from source: magic vars 13273 1726853317.72949: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.72953: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853317.73086: variable 'network_provider' from source: set_fact 13273 1726853317.73096: variable 'network_state' from source: role '' defaults 13273 1726853317.73108: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13273 1726853317.73118: variable 'omit' from source: magic vars 13273 1726853317.73185: variable 'omit' from source: magic vars 13273 1726853317.73218: variable 'network_service_name' from source: role '' defaults 13273 1726853317.73291: variable 'network_service_name' from source: role '' defaults 13273 1726853317.73403: variable '__network_provider_setup' from source: role '' defaults 13273 1726853317.73414: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853317.73676: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853317.73679: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853317.73682: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853317.73784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13273 1726853317.75922: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853317.75995: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853317.76037: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853317.76095: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853317.76126: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853317.76216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.76250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.76281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.76331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.76354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.76410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853317.76440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.76466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.76512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.76676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.76774: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853317.76899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.76930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.76958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.77001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.77028: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.77123: variable 'ansible_python' from source: facts 13273 1726853317.77153: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853317.77243: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853317.77322: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853317.77557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.77560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.77562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.77564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.77566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.77613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853317.77649: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853317.77685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.77725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853317.77741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853317.77876: variable 'network_connections' from source: task vars 13273 1726853317.77894: variable 'controller_profile' from source: play vars 13273 1726853317.77964: variable 'controller_profile' from source: play vars 13273 1726853317.77987: variable 'controller_device' from source: play vars 13273 1726853317.78060: variable 'controller_device' from source: play vars 13273 1726853317.78081: variable 'port1_profile' from source: play vars 13273 1726853317.78159: variable 'port1_profile' from source: play vars 13273 1726853317.78179: variable 'dhcp_interface1' from source: play vars 13273 1726853317.78257: variable 'dhcp_interface1' from source: play vars 13273 1726853317.78315: variable 'controller_profile' from source: play vars 13273 1726853317.78356: variable 'controller_profile' from source: play vars 13273 1726853317.78376: variable 'port2_profile' from source: play vars 13273 1726853317.78455: variable 'port2_profile' from source: play vars 13273 1726853317.78474: variable 'dhcp_interface2' from source: play vars 13273 1726853317.78550: variable 'dhcp_interface2' from source: play vars 13273 
1726853317.78640: variable 'controller_profile' from source: play vars 13273 1726853317.78644: variable 'controller_profile' from source: play vars 13273 1726853317.78750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853317.78944: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853317.79010: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853317.79434: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853317.79514: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853317.79549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853317.79584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853317.79626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853317.79664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853317.79732: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853317.80007: variable 'network_connections' from source: task vars 13273 1726853317.80059: variable 'controller_profile' from source: play vars 13273 1726853317.80107: variable 'controller_profile' from source: play vars 13273 
1726853317.80124: variable 'controller_device' from source: play vars 13273 1726853317.80203: variable 'controller_device' from source: play vars 13273 1726853317.80220: variable 'port1_profile' from source: play vars 13273 1726853317.80299: variable 'port1_profile' from source: play vars 13273 1726853317.80387: variable 'dhcp_interface1' from source: play vars 13273 1726853317.80390: variable 'dhcp_interface1' from source: play vars 13273 1726853317.80401: variable 'controller_profile' from source: play vars 13273 1726853317.80469: variable 'controller_profile' from source: play vars 13273 1726853317.80486: variable 'port2_profile' from source: play vars 13273 1726853317.80554: variable 'port2_profile' from source: play vars 13273 1726853317.80568: variable 'dhcp_interface2' from source: play vars 13273 1726853317.80641: variable 'dhcp_interface2' from source: play vars 13273 1726853317.80657: variable 'controller_profile' from source: play vars 13273 1726853317.80777: variable 'controller_profile' from source: play vars 13273 1726853317.80784: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853317.80861: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853317.81135: variable 'network_connections' from source: task vars 13273 1726853317.81152: variable 'controller_profile' from source: play vars 13273 1726853317.81225: variable 'controller_profile' from source: play vars 13273 1726853317.81237: variable 'controller_device' from source: play vars 13273 1726853317.81317: variable 'controller_device' from source: play vars 13273 1726853317.81331: variable 'port1_profile' from source: play vars 13273 1726853317.81476: variable 'port1_profile' from source: play vars 13273 1726853317.81479: variable 'dhcp_interface1' from source: play vars 13273 1726853317.81495: variable 'dhcp_interface1' from source: play vars 13273 1726853317.81508: variable 'controller_profile' from source: play vars 
13273 1726853317.81584: variable 'controller_profile' from source: play vars 13273 1726853317.81597: variable 'port2_profile' from source: play vars 13273 1726853317.81666: variable 'port2_profile' from source: play vars 13273 1726853317.81687: variable 'dhcp_interface2' from source: play vars 13273 1726853317.81758: variable 'dhcp_interface2' from source: play vars 13273 1726853317.81798: variable 'controller_profile' from source: play vars 13273 1726853317.81995: variable 'controller_profile' from source: play vars 13273 1726853317.81998: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853317.82000: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853317.82289: variable 'network_connections' from source: task vars 13273 1726853317.82300: variable 'controller_profile' from source: play vars 13273 1726853317.82381: variable 'controller_profile' from source: play vars 13273 1726853317.82394: variable 'controller_device' from source: play vars 13273 1726853317.82473: variable 'controller_device' from source: play vars 13273 1726853317.82489: variable 'port1_profile' from source: play vars 13273 1726853317.82564: variable 'port1_profile' from source: play vars 13273 1726853317.82581: variable 'dhcp_interface1' from source: play vars 13273 1726853317.82660: variable 'dhcp_interface1' from source: play vars 13273 1726853317.82674: variable 'controller_profile' from source: play vars 13273 1726853317.82744: variable 'controller_profile' from source: play vars 13273 1726853317.82764: variable 'port2_profile' from source: play vars 13273 1726853317.82836: variable 'port2_profile' from source: play vars 13273 1726853317.82849: variable 'dhcp_interface2' from source: play vars 13273 1726853317.82929: variable 'dhcp_interface2' from source: play vars 13273 1726853317.82941: variable 'controller_profile' from source: play vars 13273 1726853317.83022: variable 'controller_profile' from source: play vars 
13273 1726853317.83101: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853317.83167: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853317.83183: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853317.83251: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853317.83484: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853317.84000: variable 'network_connections' from source: task vars 13273 1726853317.84011: variable 'controller_profile' from source: play vars 13273 1726853317.84177: variable 'controller_profile' from source: play vars 13273 1726853317.84181: variable 'controller_device' from source: play vars 13273 1726853317.84183: variable 'controller_device' from source: play vars 13273 1726853317.84185: variable 'port1_profile' from source: play vars 13273 1726853317.84231: variable 'port1_profile' from source: play vars 13273 1726853317.84243: variable 'dhcp_interface1' from source: play vars 13273 1726853317.84313: variable 'dhcp_interface1' from source: play vars 13273 1726853317.84324: variable 'controller_profile' from source: play vars 13273 1726853317.84386: variable 'controller_profile' from source: play vars 13273 1726853317.84399: variable 'port2_profile' from source: play vars 13273 1726853317.84465: variable 'port2_profile' from source: play vars 13273 1726853317.84481: variable 'dhcp_interface2' from source: play vars 13273 1726853317.84576: variable 'dhcp_interface2' from source: play vars 13273 1726853317.84579: variable 'controller_profile' from source: play vars 13273 1726853317.84631: variable 'controller_profile' from source: play vars 13273 1726853317.84645: variable 'ansible_distribution' from source: facts 13273 1726853317.84655: variable '__network_rh_distros' from source: role '' defaults 13273 1726853317.84665: 
variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.84740: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853317.84884: variable 'ansible_distribution' from source: facts 13273 1726853317.84893: variable '__network_rh_distros' from source: role '' defaults 13273 1726853317.84904: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.84925: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853317.85109: variable 'ansible_distribution' from source: facts 13273 1726853317.85119: variable '__network_rh_distros' from source: role '' defaults 13273 1726853317.85129: variable 'ansible_distribution_major_version' from source: facts 13273 1726853317.85376: variable 'network_provider' from source: set_fact 13273 1726853317.85380: variable 'omit' from source: magic vars 13273 1726853317.85382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853317.85385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853317.85387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853317.85390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853317.85392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853317.85394: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853317.85396: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.85398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.85467: Set connection var ansible_connection to ssh 13273 
1726853317.85486: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853317.85497: Set connection var ansible_shell_executable to /bin/sh 13273 1726853317.85504: Set connection var ansible_shell_type to sh 13273 1726853317.85523: Set connection var ansible_pipelining to False 13273 1726853317.85533: Set connection var ansible_timeout to 10 13273 1726853317.85567: variable 'ansible_shell_executable' from source: unknown 13273 1726853317.85578: variable 'ansible_connection' from source: unknown 13273 1726853317.85586: variable 'ansible_module_compression' from source: unknown 13273 1726853317.85593: variable 'ansible_shell_type' from source: unknown 13273 1726853317.85600: variable 'ansible_shell_executable' from source: unknown 13273 1726853317.85607: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853317.85620: variable 'ansible_pipelining' from source: unknown 13273 1726853317.85632: variable 'ansible_timeout' from source: unknown 13273 1726853317.85640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853317.85757: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853317.85775: variable 'omit' from source: magic vars 13273 1726853317.85843: starting attempt loop 13273 1726853317.85846: running the handler 13273 1726853317.85879: variable 'ansible_facts' from source: unknown 13273 1726853317.86646: _low_level_execute_command(): starting 13273 1726853317.86659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853317.87487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853317.87493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853317.87507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853317.87610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853317.89364: stdout chunk (state=3): >>>/root <<< 13273 1726853317.89507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853317.89524: stdout chunk (state=3): >>><<< 13273 1726853317.89536: stderr chunk (state=3): >>><<< 13273 1726853317.89567: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853317.89677: _low_level_execute_command(): starting 13273 1726853317.89681: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095 `" && echo ansible-tmp-1726853317.8958225-14927-187449985441095="` echo /root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095 `" ) && sleep 0' 13273 1726853317.90283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853317.90300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853317.90403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853317.92400: stdout chunk (state=3): >>>ansible-tmp-1726853317.8958225-14927-187449985441095=/root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095 <<< 13273 1726853317.92507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853317.92539: stderr chunk (state=3): >>><<< 13273 1726853317.92541: stdout chunk (state=3): >>><<< 13273 1726853317.92553: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853317.8958225-14927-187449985441095=/root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853317.92600: variable 'ansible_module_compression' from source: unknown 13273 1726853317.92639: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13273 1726853317.92691: variable 'ansible_facts' from source: unknown 13273 1726853317.92819: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/AnsiballZ_systemd.py 13273 1726853317.92919: Sending initial data 13273 1726853317.92923: Sent initial data (156 bytes) 13273 1726853317.93345: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853317.93383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853317.93386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853317.93390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853317.93392: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853317.93394: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853317.93437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853317.93440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853317.93504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853317.95160: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853317.95225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853317.95285: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpip158s4g /root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/AnsiballZ_systemd.py <<< 13273 1726853317.95288: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/AnsiballZ_systemd.py" <<< 13273 1726853317.95348: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpip158s4g" to remote "/root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/AnsiballZ_systemd.py" <<< 13273 1726853317.96534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853317.96575: stderr chunk (state=3): >>><<< 13273 1726853317.96579: stdout chunk (state=3): >>><<< 13273 1726853317.96618: done transferring module to remote 13273 1726853317.96630: _low_level_execute_command(): starting 13273 1726853317.96633: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/ /root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/AnsiballZ_systemd.py && sleep 0' 13273 1726853317.97079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853317.97086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853317.97088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 
1726853317.97090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853317.97093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853317.97133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853317.97137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853317.97200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853317.99059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853317.99082: stderr chunk (state=3): >>><<< 13273 1726853317.99086: stdout chunk (state=3): >>><<< 13273 1726853317.99098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853317.99101: _low_level_execute_command(): starting 13273 1726853317.99103: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/AnsiballZ_systemd.py && sleep 0' 13273 1726853317.99524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853317.99527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853317.99530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853317.99532: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853317.99534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853317.99577: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853317.99590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853317.99657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853318.29341: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10465280", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314671616", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1100600000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 13273 1726853318.29368: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysini<<< 13273 1726853318.29381: stdout chunk (state=3): >>>t.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13273 1726853318.31356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853318.31389: stderr chunk (state=3): >>><<< 13273 1726853318.31392: stdout chunk (state=3): >>><<< 13273 1726853318.31411: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10465280", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314671616", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1100600000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
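For orientation between log entries: the `module_args` echoed in the invocation above correspond to a task of roughly the following shape. This is a sketch reconstructed from the logged arguments only, not the role's verbatim source (the actual task lives in the fedora.linux_system_roles.network role):

```yaml
# Sketch only: reconstructed from the logged module_args, not copied from the role.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
```

Because the returned status already shows `ActiveState: active` and `UnitFileState: enabled`, the module reports `"changed": false` — the task is a no-op confirmation, not a state change.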
13273 1726853318.31521: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853318.31538: _low_level_execute_command(): starting 13273 1726853318.31543: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853317.8958225-14927-187449985441095/ > /dev/null 2>&1 && sleep 0' 13273 1726853318.31976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853318.31980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853318.31995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853318.32053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853318.32059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853318.32062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853318.32117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853318.34005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853318.34032: stderr chunk (state=3): >>><<< 13273 1726853318.34036: stdout chunk (state=3): >>><<< 13273 1726853318.34049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 13273 1726853318.34052: handler run complete 13273 1726853318.34095: attempt loop complete, returning result 13273 1726853318.34098: _execute() done 13273 1726853318.34100: dumping result to json 13273 1726853318.34112: done dumping result, returning 13273 1726853318.34121: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-5fc3-657d-0000000000db] 13273 1726853318.34126: sending task result for task 02083763-bbaf-5fc3-657d-0000000000db 13273 1726853318.34318: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000db 13273 1726853318.34321: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853318.34382: no more pending results, returning what we have 13273 1726853318.34386: results queue empty 13273 1726853318.34386: checking for any_errors_fatal 13273 1726853318.34392: done checking for any_errors_fatal 13273 1726853318.34392: checking for max_fail_percentage 13273 1726853318.34394: done checking for max_fail_percentage 13273 1726853318.34395: checking to see if all hosts have failed and the running result is not ok 13273 1726853318.34395: done checking to see if all hosts have failed 13273 1726853318.34396: getting the remaining hosts for this loop 13273 1726853318.34398: done getting the remaining hosts for this loop 13273 1726853318.34401: getting the next task for host managed_node3 13273 1726853318.34406: done getting next task for host managed_node3 13273 1726853318.34409: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853318.34412: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853318.34424: getting variables 13273 1726853318.34426: in VariableManager get_vars() 13273 1726853318.34512: Calling all_inventory to load vars for managed_node3 13273 1726853318.34515: Calling groups_inventory to load vars for managed_node3 13273 1726853318.34517: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853318.34526: Calling all_plugins_play to load vars for managed_node3 13273 1726853318.34529: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853318.34531: Calling groups_plugins_play to load vars for managed_node3 13273 1726853318.35433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853318.36274: done with get_vars() 13273 1726853318.36289: done getting variables 13273 1726853318.36333: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:28:38 -0400 (0:00:00.648) 0:00:36.252 ****** 13273 1726853318.36359: entering _queue_task() for managed_node3/service 13273 1726853318.36606: worker is 1 (out of 1 available) 13273 1726853318.36619: exiting _queue_task() for managed_node3/service 
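The task announced above ("Enable and start wpa_supplicant", role tasks/main.yml:133) is queued through the same `service` action plugin. A hedged sketch of a conditional service task of this kind follows; the variable names are taken from the variable loads in this log, but the exact task body in the role may differ:

```yaml
# Sketch only: illustrates a conditional service task of the kind the role runs here;
# variable names come from this log's variable dump, the real role source may differ.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
```

As the subsequent entries show, the plugin first evaluates these conditionals (`network_provider == "nm"` is True here) before deciding whether to execute the module on the target.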
13273 1726853318.36632: done queuing things up, now waiting for results queue to drain 13273 1726853318.36633: waiting for pending results... 13273 1726853318.36833: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853318.36919: in run() - task 02083763-bbaf-5fc3-657d-0000000000dc 13273 1726853318.36931: variable 'ansible_search_path' from source: unknown 13273 1726853318.36935: variable 'ansible_search_path' from source: unknown 13273 1726853318.36962: calling self._execute() 13273 1726853318.37042: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853318.37049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853318.37057: variable 'omit' from source: magic vars 13273 1726853318.37332: variable 'ansible_distribution_major_version' from source: facts 13273 1726853318.37342: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853318.37420: variable 'network_provider' from source: set_fact 13273 1726853318.37424: Evaluated conditional (network_provider == "nm"): True 13273 1726853318.37487: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853318.37550: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853318.37663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853318.39059: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853318.39105: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853318.39130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853318.39160: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853318.39183: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853318.39257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853318.39274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853318.39292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853318.39317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853318.39327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853318.39364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853318.39382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853318.39399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 13273 1726853318.39424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853318.39435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853318.39465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853318.39484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853318.39500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853318.39523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853318.39533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853318.39630: variable 'network_connections' from source: task vars 13273 1726853318.39640: variable 'controller_profile' from source: play vars 13273 1726853318.39689: variable 'controller_profile' from source: play vars 13273 1726853318.39698: variable 'controller_device' from source: play vars 13273 1726853318.39738: variable 'controller_device' from source: play vars 13273 1726853318.39749: 
variable 'port1_profile' from source: play vars 13273 1726853318.39790: variable 'port1_profile' from source: play vars 13273 1726853318.39799: variable 'dhcp_interface1' from source: play vars 13273 1726853318.39838: variable 'dhcp_interface1' from source: play vars 13273 1726853318.39843: variable 'controller_profile' from source: play vars 13273 1726853318.39886: variable 'controller_profile' from source: play vars 13273 1726853318.39892: variable 'port2_profile' from source: play vars 13273 1726853318.39937: variable 'port2_profile' from source: play vars 13273 1726853318.39943: variable 'dhcp_interface2' from source: play vars 13273 1726853318.39987: variable 'dhcp_interface2' from source: play vars 13273 1726853318.39993: variable 'controller_profile' from source: play vars 13273 1726853318.40035: variable 'controller_profile' from source: play vars 13273 1726853318.40087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853318.40283: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853318.40302: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853318.40324: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853318.40348: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853318.40381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853318.40398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853318.40415: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853318.40436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853318.40477: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853318.40639: variable 'network_connections' from source: task vars 13273 1726853318.40642: variable 'controller_profile' from source: play vars 13273 1726853318.40688: variable 'controller_profile' from source: play vars 13273 1726853318.40692: variable 'controller_device' from source: play vars 13273 1726853318.40734: variable 'controller_device' from source: play vars 13273 1726853318.40741: variable 'port1_profile' from source: play vars 13273 1726853318.40790: variable 'port1_profile' from source: play vars 13273 1726853318.40794: variable 'dhcp_interface1' from source: play vars 13273 1726853318.40841: variable 'dhcp_interface1' from source: play vars 13273 1726853318.40849: variable 'controller_profile' from source: play vars 13273 1726853318.40890: variable 'controller_profile' from source: play vars 13273 1726853318.40896: variable 'port2_profile' from source: play vars 13273 1726853318.40938: variable 'port2_profile' from source: play vars 13273 1726853318.40943: variable 'dhcp_interface2' from source: play vars 13273 1726853318.40985: variable 'dhcp_interface2' from source: play vars 13273 1726853318.40991: variable 'controller_profile' from source: play vars 13273 1726853318.41032: variable 'controller_profile' from source: play vars 13273 1726853318.41063: Evaluated conditional (__network_wpa_supplicant_required): False 13273 1726853318.41066: when evaluation is False, skipping this task 13273 1726853318.41069: _execute() done 13273 1726853318.41073: 
dumping result to json 13273 1726853318.41075: done dumping result, returning 13273 1726853318.41082: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-5fc3-657d-0000000000dc] 13273 1726853318.41087: sending task result for task 02083763-bbaf-5fc3-657d-0000000000dc 13273 1726853318.41179: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000dc 13273 1726853318.41183: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13273 1726853318.41238: no more pending results, returning what we have 13273 1726853318.41241: results queue empty 13273 1726853318.41242: checking for any_errors_fatal 13273 1726853318.41263: done checking for any_errors_fatal 13273 1726853318.41264: checking for max_fail_percentage 13273 1726853318.41266: done checking for max_fail_percentage 13273 1726853318.41266: checking to see if all hosts have failed and the running result is not ok 13273 1726853318.41267: done checking to see if all hosts have failed 13273 1726853318.41267: getting the remaining hosts for this loop 13273 1726853318.41269: done getting the remaining hosts for this loop 13273 1726853318.41274: getting the next task for host managed_node3 13273 1726853318.41281: done getting next task for host managed_node3 13273 1726853318.41284: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853318.41287: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853318.41307: getting variables 13273 1726853318.41309: in VariableManager get_vars() 13273 1726853318.41357: Calling all_inventory to load vars for managed_node3 13273 1726853318.41360: Calling groups_inventory to load vars for managed_node3 13273 1726853318.41362: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853318.41370: Calling all_plugins_play to load vars for managed_node3 13273 1726853318.41378: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853318.41381: Calling groups_plugins_play to load vars for managed_node3 13273 1726853318.42591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853318.44589: done with get_vars() 13273 1726853318.44688: done getting variables 13273 1726853318.44758: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:28:38 -0400 (0:00:00.084) 0:00:36.337 ****** 13273 1726853318.44880: entering _queue_task() for managed_node3/service 13273 1726853318.45433: worker is 1 (out of 1 available) 13273 1726853318.45459: exiting _queue_task() for managed_node3/service 13273 1726853318.45679: done queuing things up, now waiting for results queue to drain 13273 1726853318.45682: waiting for pending results... 
13273 1726853318.46077: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853318.46306: in run() - task 02083763-bbaf-5fc3-657d-0000000000dd 13273 1726853318.46597: variable 'ansible_search_path' from source: unknown 13273 1726853318.46604: variable 'ansible_search_path' from source: unknown 13273 1726853318.46608: calling self._execute() 13273 1726853318.46920: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853318.47350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853318.47357: variable 'omit' from source: magic vars 13273 1726853318.48421: variable 'ansible_distribution_major_version' from source: facts 13273 1726853318.48461: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853318.48816: variable 'network_provider' from source: set_fact 13273 1726853318.48866: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853318.48934: when evaluation is False, skipping this task 13273 1726853318.48981: _execute() done 13273 1726853318.49010: dumping result to json 13273 1726853318.49015: done dumping result, returning 13273 1726853318.49027: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-5fc3-657d-0000000000dd] 13273 1726853318.49104: sending task result for task 02083763-bbaf-5fc3-657d-0000000000dd 13273 1726853318.49600: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000dd 13273 1726853318.49607: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853318.49700: no more pending results, returning what we have 13273 1726853318.49704: results queue empty 13273 1726853318.49705: checking for any_errors_fatal 13273 1726853318.49712: done checking for 
any_errors_fatal 13273 1726853318.49713: checking for max_fail_percentage 13273 1726853318.49715: done checking for max_fail_percentage 13273 1726853318.49715: checking to see if all hosts have failed and the running result is not ok 13273 1726853318.49716: done checking to see if all hosts have failed 13273 1726853318.49717: getting the remaining hosts for this loop 13273 1726853318.49719: done getting the remaining hosts for this loop 13273 1726853318.49722: getting the next task for host managed_node3 13273 1726853318.49729: done getting next task for host managed_node3 13273 1726853318.49733: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853318.49737: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853318.49762: getting variables 13273 1726853318.49764: in VariableManager get_vars() 13273 1726853318.49827: Calling all_inventory to load vars for managed_node3 13273 1726853318.49830: Calling groups_inventory to load vars for managed_node3 13273 1726853318.49832: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853318.49843: Calling all_plugins_play to load vars for managed_node3 13273 1726853318.49846: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853318.49852: Calling groups_plugins_play to load vars for managed_node3 13273 1726853318.51493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853318.53487: done with get_vars() 13273 1726853318.53513: done getting variables 13273 1726853318.53582: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:28:38 -0400 (0:00:00.087) 0:00:36.425 ****** 13273 1726853318.53618: entering _queue_task() for managed_node3/copy 13273 1726853318.54000: worker is 1 (out of 1 available) 13273 1726853318.54012: exiting _queue_task() for managed_node3/copy 13273 1726853318.54023: done queuing things up, now waiting for results queue to drain 13273 1726853318.54024: waiting for pending results... 
13273 1726853318.54398: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853318.54462: in run() - task 02083763-bbaf-5fc3-657d-0000000000de 13273 1726853318.54485: variable 'ansible_search_path' from source: unknown 13273 1726853318.54497: variable 'ansible_search_path' from source: unknown 13273 1726853318.54537: calling self._execute() 13273 1726853318.54654: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853318.54667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853318.54712: variable 'omit' from source: magic vars 13273 1726853318.55087: variable 'ansible_distribution_major_version' from source: facts 13273 1726853318.55107: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853318.55222: variable 'network_provider' from source: set_fact 13273 1726853318.55255: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853318.55258: when evaluation is False, skipping this task 13273 1726853318.55260: _execute() done 13273 1726853318.55262: dumping result to json 13273 1726853318.55264: done dumping result, returning 13273 1726853318.55267: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-5fc3-657d-0000000000de] 13273 1726853318.55274: sending task result for task 02083763-bbaf-5fc3-657d-0000000000de 13273 1726853318.55548: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000de 13273 1726853318.55551: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853318.55602: no more pending results, returning what we have 13273 1726853318.55606: results queue empty 13273 1726853318.55607: checking for 
any_errors_fatal 13273 1726853318.55613: done checking for any_errors_fatal 13273 1726853318.55614: checking for max_fail_percentage 13273 1726853318.55616: done checking for max_fail_percentage 13273 1726853318.55616: checking to see if all hosts have failed and the running result is not ok 13273 1726853318.55617: done checking to see if all hosts have failed 13273 1726853318.55618: getting the remaining hosts for this loop 13273 1726853318.55620: done getting the remaining hosts for this loop 13273 1726853318.55623: getting the next task for host managed_node3 13273 1726853318.55630: done getting next task for host managed_node3 13273 1726853318.55634: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853318.55637: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853318.55664: getting variables 13273 1726853318.55666: in VariableManager get_vars() 13273 1726853318.55725: Calling all_inventory to load vars for managed_node3 13273 1726853318.55728: Calling groups_inventory to load vars for managed_node3 13273 1726853318.55732: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853318.55743: Calling all_plugins_play to load vars for managed_node3 13273 1726853318.55749: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853318.55753: Calling groups_plugins_play to load vars for managed_node3 13273 1726853318.57185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853318.59966: done with get_vars() 13273 1726853318.60094: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:28:38 -0400 (0:00:00.065) 0:00:36.490 ****** 13273 1726853318.60180: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853318.60965: worker is 1 (out of 1 available) 13273 1726853318.60981: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853318.60993: done queuing things up, now waiting for results queue to drain 13273 1726853318.60994: waiting for pending results... 
13273 1726853318.61616: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853318.61831: in run() - task 02083763-bbaf-5fc3-657d-0000000000df 13273 1726853318.61977: variable 'ansible_search_path' from source: unknown 13273 1726853318.61981: variable 'ansible_search_path' from source: unknown 13273 1726853318.61984: calling self._execute() 13273 1726853318.62255: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853318.62273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853318.62290: variable 'omit' from source: magic vars 13273 1726853318.63340: variable 'ansible_distribution_major_version' from source: facts 13273 1726853318.63349: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853318.63353: variable 'omit' from source: magic vars 13273 1726853318.63453: variable 'omit' from source: magic vars 13273 1726853318.63737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853318.73629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853318.73703: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853318.73813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853318.73911: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853318.73940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853318.74046: variable 'network_provider' from source: set_fact 13273 1726853318.74428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853318.74481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853318.74597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853318.74643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853318.74689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853318.74783: variable 'omit' from source: magic vars 13273 1726853318.74894: variable 'omit' from source: magic vars 13273 1726853318.74998: variable 'network_connections' from source: task vars 13273 1726853318.75013: variable 'controller_profile' from source: play vars 13273 1726853318.75076: variable 'controller_profile' from source: play vars 13273 1726853318.75090: variable 'controller_device' from source: play vars 13273 1726853318.75149: variable 'controller_device' from source: play vars 13273 1726853318.75164: variable 'port1_profile' from source: play vars 13273 1726853318.75224: variable 'port1_profile' from source: play vars 13273 1726853318.75237: variable 'dhcp_interface1' from source: play vars 13273 1726853318.75377: variable 'dhcp_interface1' from source: play vars 13273 1726853318.75380: variable 'controller_profile' from source: play vars 13273 1726853318.75383: variable 'controller_profile' from source: play vars 13273 1726853318.75385: 
variable 'port2_profile' from source: play vars 13273 1726853318.75421: variable 'port2_profile' from source: play vars 13273 1726853318.75435: variable 'dhcp_interface2' from source: play vars 13273 1726853318.75497: variable 'dhcp_interface2' from source: play vars 13273 1726853318.75508: variable 'controller_profile' from source: play vars 13273 1726853318.75566: variable 'controller_profile' from source: play vars 13273 1726853318.75732: variable 'omit' from source: magic vars 13273 1726853318.75746: variable '__lsr_ansible_managed' from source: task vars 13273 1726853318.75806: variable '__lsr_ansible_managed' from source: task vars 13273 1726853318.75977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13273 1726853318.76179: Loaded config def from plugin (lookup/template) 13273 1726853318.76188: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13273 1726853318.76216: File lookup term: get_ansible_managed.j2 13273 1726853318.76224: variable 'ansible_search_path' from source: unknown 13273 1726853318.76276: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13273 1726853318.76281: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13273 1726853318.76284: variable 'ansible_search_path' from source: unknown 13273 1726853318.82968: variable 'ansible_managed' from source: unknown 13273 1726853318.83095: variable 'omit' from source: magic vars 13273 1726853318.83124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853318.83151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853318.83276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853318.83279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853318.83281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853318.83284: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853318.83286: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853318.83288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853318.83317: Set connection var ansible_connection to ssh 13273 1726853318.83331: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853318.83341: Set connection var ansible_shell_executable to /bin/sh 13273 1726853318.83347: Set connection var ansible_shell_type to sh 13273 1726853318.83364: Set connection var ansible_pipelining to False 13273 1726853318.83376: Set connection var ansible_timeout to 10 13273 1726853318.83407: 
variable 'ansible_shell_executable' from source: unknown 13273 1726853318.83416: variable 'ansible_connection' from source: unknown 13273 1726853318.83423: variable 'ansible_module_compression' from source: unknown 13273 1726853318.83430: variable 'ansible_shell_type' from source: unknown 13273 1726853318.83436: variable 'ansible_shell_executable' from source: unknown 13273 1726853318.83442: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853318.83449: variable 'ansible_pipelining' from source: unknown 13273 1726853318.83456: variable 'ansible_timeout' from source: unknown 13273 1726853318.83463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853318.83580: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853318.83593: variable 'omit' from source: magic vars 13273 1726853318.83604: starting attempt loop 13273 1726853318.83610: running the handler 13273 1726853318.83622: _low_level_execute_command(): starting 13273 1726853318.83631: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853318.84266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853318.84387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853318.84402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853318.84418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853318.84513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853318.86225: stdout chunk (state=3): >>>/root <<< 13273 1726853318.86359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853318.86377: stdout chunk (state=3): >>><<< 13273 1726853318.86390: stderr chunk (state=3): >>><<< 13273 1726853318.86415: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853318.86433: _low_level_execute_command(): starting 13273 1726853318.86445: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137 `" && echo ansible-tmp-1726853318.8642185-14979-184803658775137="` echo /root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137 `" ) && sleep 0' 13273 1726853318.87061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853318.87079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853318.87095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853318.87117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853318.87137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853318.87237: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 13273 1726853318.87282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853318.87341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853318.89339: stdout chunk (state=3): >>>ansible-tmp-1726853318.8642185-14979-184803658775137=/root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137 <<< 13273 1726853318.89485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853318.89496: stdout chunk (state=3): >>><<< 13273 1726853318.89509: stderr chunk (state=3): >>><<< 13273 1726853318.89534: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853318.8642185-14979-184803658775137=/root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 13273 1726853318.89675: variable 'ansible_module_compression' from source: unknown 13273 1726853318.89679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13273 1726853318.89681: variable 'ansible_facts' from source: unknown 13273 1726853318.89783: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/AnsiballZ_network_connections.py 13273 1726853318.89925: Sending initial data 13273 1726853318.90033: Sent initial data (168 bytes) 13273 1726853318.90596: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853318.90610: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853318.90625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853318.90644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853318.90705: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853318.90746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 
13273 1726853318.90764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853318.90788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853318.90884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853318.92562: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853318.92648: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853318.92718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpykwsix31 /root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/AnsiballZ_network_connections.py <<< 13273 1726853318.92730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/AnsiballZ_network_connections.py" <<< 13273 1726853318.92776: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13273 1726853318.92803: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpykwsix31" to remote "/root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/AnsiballZ_network_connections.py" <<< 13273 1726853318.94047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853318.94051: stdout chunk (state=3): >>><<< 13273 1726853318.94053: stderr chunk (state=3): >>><<< 13273 1726853318.94055: done transferring module to remote 13273 1726853318.94057: _low_level_execute_command(): starting 13273 1726853318.94059: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/ /root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/AnsiballZ_network_connections.py && sleep 0' 13273 1726853318.94606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853318.94625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853318.94687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853318.94749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853318.94768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853318.94795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853318.94883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853318.96807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853318.96811: stdout chunk (state=3): >>><<< 13273 1726853318.96813: stderr chunk (state=3): >>><<< 13273 1726853318.96906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853318.96909: _low_level_execute_command(): starting 13273 1726853318.96912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/AnsiballZ_network_connections.py && sleep 0' 13273 1726853318.97449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853318.97464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853318.97529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853318.97542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853318.97592: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853318.97610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853318.97660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853318.97728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853319.53415: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 
65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13273 1726853319.55536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853319.55550: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 13273 1726853319.55608: stderr chunk (state=3): >>><<< 13273 1726853319.55622: stdout chunk (state=3): >>><<< 13273 1726853319.55644: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# 
Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
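The `module_args` in the result above can be read back into role variables. The following is a sketch reconstructed from the log (values copied verbatim from the invocation; the `network_connections` variable name is the network role's input variable, and the playbook layout around it is assumed):

```yaml
# Sketch: role input equivalent to the module_args logged above.
# Values taken from the invocation in the log; surrounding layout assumed.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      miimon: 110          # link-monitoring interval in ms, as logged
    ip:
      route_metric4: 65535 # deprioritize the bond's IPv4 routes
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0      # port of bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
```

The result's `stderr` lines ([007]–[012]) mirror these three connections: `bond0` is added and brought up as modified, while the two port profiles `bond0.0`/`bond0.1` are updated and brought up from a not-active state.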
13273 1726853319.55713: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853319.55793: _low_level_execute_command(): starting 13273 1726853319.55796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853318.8642185-14979-184803658775137/ > /dev/null 2>&1 && sleep 0' 13273 1726853319.56335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853319.56349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853319.56361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853319.56381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853319.56436: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853319.56548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853319.56605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853319.56673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853319.56769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853319.58787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853319.58815: stdout chunk (state=3): >>><<< 13273 1726853319.58819: stderr chunk (state=3): >>><<< 13273 1726853319.58838: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853319.58976: handler run complete 13273 1726853319.58979: attempt loop complete, returning result 13273 1726853319.58982: _execute() done 13273 1726853319.58984: dumping result to json 13273 1726853319.58986: done dumping result, returning 13273 1726853319.58988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-5fc3-657d-0000000000df] 13273 1726853319.58990: sending task result for task 02083763-bbaf-5fc3-657d-0000000000df 13273 1726853319.59069: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000df 13273 1726853319.59075: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 
'bond0': add connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf [008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d [009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active) 13273 1726853319.59412: no more pending results, returning what we have 13273 1726853319.59415: results queue empty 13273 1726853319.59416: checking for any_errors_fatal 13273 1726853319.59422: done checking for any_errors_fatal 13273 1726853319.59423: checking for max_fail_percentage 13273 1726853319.59425: done checking for max_fail_percentage 13273 1726853319.59426: checking to see if all hosts have failed and the running result is not ok 13273 1726853319.59427: done checking to see if all hosts have failed 13273 1726853319.59428: getting the remaining hosts for this loop 13273 1726853319.59429: done getting the remaining hosts for this loop 13273 1726853319.59432: getting the next task for host managed_node3 13273 1726853319.59438: done getting next task for host managed_node3 13273 1726853319.59441: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853319.59444: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853319.59457: getting variables 13273 1726853319.59459: in VariableManager get_vars() 13273 1726853319.59514: Calling all_inventory to load vars for managed_node3 13273 1726853319.59517: Calling groups_inventory to load vars for managed_node3 13273 1726853319.59519: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853319.59528: Calling all_plugins_play to load vars for managed_node3 13273 1726853319.59531: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853319.59534: Calling groups_plugins_play to load vars for managed_node3 13273 1726853319.71697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853319.75010: done with get_vars() 13273 1726853319.75043: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:28:39 -0400 (0:00:01.150) 0:00:37.641 ****** 13273 1726853319.75231: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853319.75997: worker is 1 (out of 1 available) 13273 1726853319.76009: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853319.76019: done queuing things up, now waiting for results queue to drain 13273 1726853319.76021: waiting for pending results... 
13273 1726853319.76693: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853319.76697: in run() - task 02083763-bbaf-5fc3-657d-0000000000e0 13273 1726853319.76701: variable 'ansible_search_path' from source: unknown 13273 1726853319.76789: variable 'ansible_search_path' from source: unknown 13273 1726853319.76794: calling self._execute() 13273 1726853319.76944: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853319.76957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853319.77023: variable 'omit' from source: magic vars 13273 1726853319.78080: variable 'ansible_distribution_major_version' from source: facts 13273 1726853319.78086: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853319.78091: variable 'network_state' from source: role '' defaults 13273 1726853319.78095: Evaluated conditional (network_state != {}): False 13273 1726853319.78098: when evaluation is False, skipping this task 13273 1726853319.78267: _execute() done 13273 1726853319.78309: dumping result to json 13273 1726853319.78319: done dumping result, returning 13273 1726853319.78336: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-5fc3-657d-0000000000e0] 13273 1726853319.78384: sending task result for task 02083763-bbaf-5fc3-657d-0000000000e0 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853319.78565: no more pending results, returning what we have 13273 1726853319.78570: results queue empty 13273 1726853319.78573: checking for any_errors_fatal 13273 1726853319.78592: done checking for any_errors_fatal 13273 1726853319.78593: checking for max_fail_percentage 13273 1726853319.78595: done checking for max_fail_percentage 13273 1726853319.78596: 
checking to see if all hosts have failed and the running result is not ok 13273 1726853319.78597: done checking to see if all hosts have failed 13273 1726853319.78597: getting the remaining hosts for this loop 13273 1726853319.78599: done getting the remaining hosts for this loop 13273 1726853319.78602: getting the next task for host managed_node3 13273 1726853319.78609: done getting next task for host managed_node3 13273 1726853319.78613: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853319.78616: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853319.78642: getting variables 13273 1726853319.78644: in VariableManager get_vars() 13273 1726853319.78903: Calling all_inventory to load vars for managed_node3 13273 1726853319.78906: Calling groups_inventory to load vars for managed_node3 13273 1726853319.78908: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853319.78919: Calling all_plugins_play to load vars for managed_node3 13273 1726853319.78921: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853319.78924: Calling groups_plugins_play to load vars for managed_node3 13273 1726853319.79687: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000e0 13273 1726853319.79691: WORKER PROCESS EXITING 13273 1726853319.81669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853319.83276: done with get_vars() 13273 1726853319.83301: done getting variables 13273 1726853319.83370: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:28:39 -0400 (0:00:00.081) 0:00:37.723 ****** 13273 1726853319.83410: entering _queue_task() for managed_node3/debug 13273 1726853319.83976: worker is 1 (out of 1 available) 13273 1726853319.83987: exiting _queue_task() for managed_node3/debug 13273 1726853319.83998: done queuing things up, now waiting for results queue to drain 13273 1726853319.83999: waiting for pending results... 
13273 1726853319.84097: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
13273 1726853319.84244: in run() - task 02083763-bbaf-5fc3-657d-0000000000e1
13273 1726853319.84265: variable 'ansible_search_path' from source: unknown
13273 1726853319.84275: variable 'ansible_search_path' from source: unknown
13273 1726853319.84319: calling self._execute()
13273 1726853319.84430: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853319.84451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853319.84465: variable 'omit' from source: magic vars
13273 1726853319.84864: variable 'ansible_distribution_major_version' from source: facts
13273 1726853319.84886: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853319.84897: variable 'omit' from source: magic vars
13273 1726853319.84956: variable 'omit' from source: magic vars
13273 1726853319.84999: variable 'omit' from source: magic vars
13273 1726853319.85043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13273 1726853319.85090: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13273 1726853319.85116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13273 1726853319.85136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853319.85155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853319.85206: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13273 1726853319.85209: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853319.85212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853319.85315: Set connection var ansible_connection to ssh
13273 1726853319.85424: Set connection var ansible_module_compression to ZIP_DEFLATED
13273 1726853319.85427: Set connection var ansible_shell_executable to /bin/sh
13273 1726853319.85430: Set connection var ansible_shell_type to sh
13273 1726853319.85432: Set connection var ansible_pipelining to False
13273 1726853319.85434: Set connection var ansible_timeout to 10
13273 1726853319.85436: variable 'ansible_shell_executable' from source: unknown
13273 1726853319.85438: variable 'ansible_connection' from source: unknown
13273 1726853319.85441: variable 'ansible_module_compression' from source: unknown
13273 1726853319.85443: variable 'ansible_shell_type' from source: unknown
13273 1726853319.85447: variable 'ansible_shell_executable' from source: unknown
13273 1726853319.85449: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853319.85450: variable 'ansible_pipelining' from source: unknown
13273 1726853319.85452: variable 'ansible_timeout' from source: unknown
13273 1726853319.85454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853319.85573: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13273 1726853319.85590: variable 'omit' from source: magic vars
13273 1726853319.85601: starting attempt loop
13273 1726853319.85607: running the handler
13273 1726853319.85976: variable '__network_connections_result' from source: set_fact
13273 1726853319.85979: handler run complete
13273 1726853319.85982: attempt loop complete, returning result
13273 1726853319.85984: _execute() done
13273 1726853319.85987: dumping result to json
13273 1726853319.85989: done dumping result, returning
13273 1726853319.86084: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-5fc3-657d-0000000000e1]
13273 1726853319.86093: sending task result for task 02083763-bbaf-5fc3-657d-0000000000e1
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf",
        "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d",
        "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906",
        "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf (is-modified)",
        "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)",
        "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)"
    ]
}
13273 1726853319.86280: no more pending results, returning what we have
13273 1726853319.86284: results queue empty
13273 1726853319.86286: checking for any_errors_fatal
13273 1726853319.86290: done checking for any_errors_fatal
13273 1726853319.86291: checking for max_fail_percentage
13273 1726853319.86292: done checking for max_fail_percentage
13273 1726853319.86293: checking to see if all hosts have failed and the running result is not ok
13273 1726853319.86294: done checking to see if all hosts have failed
13273 1726853319.86295: getting the remaining hosts for this loop
13273 1726853319.86296: done getting the remaining hosts for this loop
13273 1726853319.86300: getting the next task for host managed_node3
13273 1726853319.86306: done getting next task for host managed_node3
13273 1726853319.86310: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
13273 1726853319.86313: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853319.86326: getting variables
13273 1726853319.86327: in VariableManager get_vars()
13273 1726853319.86788: Calling all_inventory to load vars for managed_node3
13273 1726853319.86792: Calling groups_inventory to load vars for managed_node3
13273 1726853319.86795: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853319.86804: Calling all_plugins_play to load vars for managed_node3
13273 1726853319.86807: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853319.86810: Calling groups_plugins_play to load vars for managed_node3
13273 1726853319.87561: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000e1
13273 1726853319.87564: WORKER PROCESS EXITING
13273 1726853319.88553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853319.90448: done with get_vars()
13273 1726853319.90683: done getting variables
13273 1726853319.90747: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 13:28:39 -0400 (0:00:00.073) 0:00:37.797 ******
13273 1726853319.90793: entering _queue_task() for managed_node3/debug
13273 1726853319.91433: worker is 1 (out of 1 available)
13273 1726853319.91491: exiting _queue_task() for managed_node3/debug
13273 1726853319.91503: done queuing things up, now waiting for results queue to drain
13273 1726853319.91504: waiting for pending results...
13273 1726853319.91769: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
13273 1726853319.91921: in run() - task 02083763-bbaf-5fc3-657d-0000000000e2
13273 1726853319.91939: variable 'ansible_search_path' from source: unknown
13273 1726853319.91948: variable 'ansible_search_path' from source: unknown
13273 1726853319.91989: calling self._execute()
13273 1726853319.92106: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853319.92122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853319.92135: variable 'omit' from source: magic vars
13273 1726853319.92518: variable 'ansible_distribution_major_version' from source: facts
13273 1726853319.92534: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853319.92552: variable 'omit' from source: magic vars
13273 1726853319.92606: variable 'omit' from source: magic vars
13273 1726853319.92653: variable 'omit' from source: magic vars
13273 1726853319.92702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13273 1726853319.92747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13273 1726853319.92777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13273 1726853319.92798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853319.92814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853319.92877: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13273 1726853319.92880: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853319.92883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853319.92964: Set connection var ansible_connection to ssh
13273 1726853319.92985: Set connection var ansible_module_compression to ZIP_DEFLATED
13273 1726853319.92995: Set connection var ansible_shell_executable to /bin/sh
13273 1726853319.93001: Set connection var ansible_shell_type to sh
13273 1726853319.93076: Set connection var ansible_pipelining to False
13273 1726853319.93079: Set connection var ansible_timeout to 10
13273 1726853319.93081: variable 'ansible_shell_executable' from source: unknown
13273 1726853319.93083: variable 'ansible_connection' from source: unknown
13273 1726853319.93089: variable 'ansible_module_compression' from source: unknown
13273 1726853319.93094: variable 'ansible_shell_type' from source: unknown
13273 1726853319.93096: variable 'ansible_shell_executable' from source: unknown
13273 1726853319.93098: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853319.93100: variable 'ansible_pipelining' from source: unknown
13273 1726853319.93102: variable 'ansible_timeout' from source: unknown
13273 1726853319.93104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853319.93240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13273 1726853319.93260: variable 'omit' from source: magic vars
13273 1726853319.93273: starting attempt loop
13273 1726853319.93281: running the handler
13273 1726853319.93338: variable '__network_connections_result' from source: set_fact
13273 1726853319.93426: variable '__network_connections_result' from source: set_fact
13273 1726853319.93610: handler run complete
13273 1726853319.93749: attempt loop complete, returning result
13273 1726853319.93752: _execute() done
13273 1726853319.93754: dumping result to json
13273 1726853319.93756: done dumping result, returning
13273 1726853319.93759: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-5fc3-657d-0000000000e2]
13273 1726853319.93761: sending task result for task 02083763-bbaf-5fc3-657d-0000000000e2
13273 1726853319.93839: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000e2
13273 1726853319.93842: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf",
            "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d",
            "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3903334e-7358-4806-a114-5ea6dbf2cacf (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 889074c2-782b-4057-b1eb-a43c769be906 (not-active)"
        ]
    }
}
13273 1726853319.93962: no more pending results, returning what we have
13273 1726853319.93966: results queue empty
13273 1726853319.93975: checking for any_errors_fatal
13273 1726853319.93982: done checking for any_errors_fatal
13273 1726853319.93983: checking for max_fail_percentage
13273 1726853319.93985: done checking for max_fail_percentage
13273 1726853319.93986: checking to see if all hosts have failed and the running result is not ok
13273 1726853319.93987: done checking to see if all hosts have failed
13273 1726853319.93987: getting the remaining hosts for this loop
13273 1726853319.93989: done getting the remaining hosts for this loop
13273 1726853319.93993: getting the next task for host managed_node3
13273 1726853319.94000: done getting next task for host managed_node3
13273 1726853319.94004: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13273 1726853319.94007: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853319.94022: getting variables
13273 1726853319.94023: in VariableManager get_vars()
13273 1726853319.94383: Calling all_inventory to load vars for managed_node3
13273 1726853319.94386: Calling groups_inventory to load vars for managed_node3
13273 1726853319.94388: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853319.94397: Calling all_plugins_play to load vars for managed_node3
13273 1726853319.94399: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853319.94402: Calling groups_plugins_play to load vars for managed_node3
13273 1726853319.95955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853319.97643: done with get_vars()
13273 1726853319.97678: done getting variables
13273 1726853319.97788: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 13:28:39 -0400 (0:00:00.070) 0:00:37.867 ******
13273 1726853319.97855: entering _queue_task() for managed_node3/debug
13273 1726853319.98260: worker is 1 (out of 1 available)
13273 1726853319.98477: exiting _queue_task() for managed_node3/debug
13273 1726853319.98488: done queuing things up, now waiting for results queue to drain
13273 1726853319.98489: waiting for pending results...
13273 1726853319.98617: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13273 1726853319.98825: in run() - task 02083763-bbaf-5fc3-657d-0000000000e3
13273 1726853319.98829: variable 'ansible_search_path' from source: unknown
13273 1726853319.98832: variable 'ansible_search_path' from source: unknown
13273 1726853319.98834: calling self._execute()
13273 1726853319.98941: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853319.98959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853319.98977: variable 'omit' from source: magic vars
13273 1726853319.99399: variable 'ansible_distribution_major_version' from source: facts
13273 1726853319.99417: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853319.99608: variable 'network_state' from source: role '' defaults
13273 1726853319.99658: Evaluated conditional (network_state != {}): False
13273 1726853319.99667: when evaluation is False, skipping this task
13273 1726853319.99727: _execute() done
13273 1726853319.99730: dumping result to json
13273 1726853319.99733: done dumping result, returning
13273 1726853319.99736: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-5fc3-657d-0000000000e3]
13273 1726853319.99738: sending task result for task 02083763-bbaf-5fc3-657d-0000000000e3
13273 1726853320.00137: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000e3
13273 1726853320.00140: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
13273 1726853320.00321: no more pending results, returning what we have
13273 1726853320.00325: results queue empty
13273 1726853320.00326: checking for any_errors_fatal
13273 1726853320.00333: done checking for any_errors_fatal
13273 1726853320.00334: checking for max_fail_percentage
13273 1726853320.00335: done checking for max_fail_percentage
13273 1726853320.00336: checking to see if all hosts have failed and the running result is not ok
13273 1726853320.00337: done checking to see if all hosts have failed
13273 1726853320.00338: getting the remaining hosts for this loop
13273 1726853320.00339: done getting the remaining hosts for this loop
13273 1726853320.00343: getting the next task for host managed_node3
13273 1726853320.00350: done getting next task for host managed_node3
13273 1726853320.00355: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
13273 1726853320.00358: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853320.00380: getting variables
13273 1726853320.00382: in VariableManager get_vars()
13273 1726853320.00435: Calling all_inventory to load vars for managed_node3
13273 1726853320.00438: Calling groups_inventory to load vars for managed_node3
13273 1726853320.00441: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853320.00453: Calling all_plugins_play to load vars for managed_node3
13273 1726853320.00455: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853320.00458: Calling groups_plugins_play to load vars for managed_node3
13273 1726853320.03013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853320.04638: done with get_vars()
13273 1726853320.04669: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 13:28:40 -0400 (0:00:00.069) 0:00:37.936 ******
13273 1726853320.04768: entering _queue_task() for managed_node3/ping
13273 1726853320.05130: worker is 1 (out of 1 available)
13273 1726853320.05143: exiting _queue_task() for managed_node3/ping
13273 1726853320.05157: done queuing things up, now waiting for results queue to drain
13273 1726853320.05158: waiting for pending results...
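In the OpenSSH debug chatter that follows, lines like "auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'" show each low-level command reusing a persistent multiplexed connection instead of redialing the host. Ansible enables this through OpenSSH ControlMaster options; a minimal ansible.cfg sketch of that setup (values are illustrative, not read from this run):

```ini
[ssh_connection]
# Reuse one SSH connection per host; sockets live under control_path_dir
ssh_args = -o ControlMaster=auto -o ControlPersist=60s
control_path_dir = ~/.ansible/cp
```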
13273 1726853320.05534: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity
13273 1726853320.05762: in run() - task 02083763-bbaf-5fc3-657d-0000000000e4
13273 1726853320.05766: variable 'ansible_search_path' from source: unknown
13273 1726853320.05768: variable 'ansible_search_path' from source: unknown
13273 1726853320.05772: calling self._execute()
13273 1726853320.05888: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853320.05901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853320.05913: variable 'omit' from source: magic vars
13273 1726853320.06314: variable 'ansible_distribution_major_version' from source: facts
13273 1726853320.06340: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853320.06353: variable 'omit' from source: magic vars
13273 1726853320.06422: variable 'omit' from source: magic vars
13273 1726853320.06464: variable 'omit' from source: magic vars
13273 1726853320.06511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13273 1726853320.06676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13273 1726853320.06679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13273 1726853320.06681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853320.06683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853320.06685: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13273 1726853320.06687: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853320.06689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853320.06774: Set connection var ansible_connection to ssh
13273 1726853320.06790: Set connection var ansible_module_compression to ZIP_DEFLATED
13273 1726853320.06803: Set connection var ansible_shell_executable to /bin/sh
13273 1726853320.06810: Set connection var ansible_shell_type to sh
13273 1726853320.06819: Set connection var ansible_pipelining to False
13273 1726853320.06827: Set connection var ansible_timeout to 10
13273 1726853320.06859: variable 'ansible_shell_executable' from source: unknown
13273 1726853320.06891: variable 'ansible_connection' from source: unknown
13273 1726853320.06899: variable 'ansible_module_compression' from source: unknown
13273 1726853320.06907: variable 'ansible_shell_type' from source: unknown
13273 1726853320.06916: variable 'ansible_shell_executable' from source: unknown
13273 1726853320.06923: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853320.06940: variable 'ansible_pipelining' from source: unknown
13273 1726853320.06950: variable 'ansible_timeout' from source: unknown
13273 1726853320.06983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853320.07247: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
13273 1726853320.07251: variable 'omit' from source: magic vars
13273 1726853320.07253: starting attempt loop
13273 1726853320.07256: running the handler
13273 1726853320.07258: _low_level_execute_command(): starting
13273 1726853320.07259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13273 1726853320.07965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13273 1726853320.07987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853320.08011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853320.08093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853320.08136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853320.08165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853320.08181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853320.08278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853320.10011: stdout chunk (state=3): >>>/root <<<
13273 1726853320.10162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853320.10165: stdout chunk (state=3): >>><<<
13273 1726853320.10168: stderr chunk (state=3): >>><<<
13273 1726853320.10193: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853320.10215: _low_level_execute_command(): starting
13273 1726853320.10239: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580 `" && echo ansible-tmp-1726853320.1020055-15027-64786972311580="` echo /root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580 `" ) && sleep 0'
13273 1726853320.10856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13273 1726853320.10874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853320.10890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853320.10905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13273 1726853320.10938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<<
13273 1726853320.11042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853320.11046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853320.11069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853320.11157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853320.13142: stdout chunk (state=3): >>>ansible-tmp-1726853320.1020055-15027-64786972311580=/root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580 <<<
13273 1726853320.13289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853320.13301: stderr chunk (state=3): >>><<<
13273 1726853320.13310: stdout chunk (state=3): >>><<<
13273 1726853320.13345: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853320.1020055-15027-64786972311580=/root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853320.13402: variable 'ansible_module_compression' from source: unknown
13273 1726853320.13501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
13273 1726853320.13504: variable 'ansible_facts' from source: unknown
13273 1726853320.13617: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/AnsiballZ_ping.py
13273 1726853320.13795: Sending initial data
13273 1726853320.13811: Sent initial data (152 bytes)
13273 1726853320.14488: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853320.14548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853320.14566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853320.14600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853320.14686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853320.16325: stderr chunk (state=3): >>>debug2: Remote version: 3 <<<
13273 1726853320.16362: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
13273 1726853320.16406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "."
<<< 13273 1726853320.16566: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp22myzll3 /root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/AnsiballZ_ping.py <<< 13273 1726853320.16569: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/AnsiballZ_ping.py" <<< 13273 1726853320.16794: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp22myzll3" to remote "/root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/AnsiballZ_ping.py" <<< 13273 1726853320.17504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853320.17583: stderr chunk (state=3): >>><<< 13273 1726853320.17594: stdout chunk (state=3): >>><<< 13273 1726853320.17640: done transferring module to remote 13273 1726853320.17660: _low_level_execute_command(): starting 13273 1726853320.17676: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/ /root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/AnsiballZ_ping.py && sleep 0' 13273 1726853320.18331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.18350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853320.18385: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.18458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853320.18488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853320.18501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853320.18582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853320.20839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853320.20843: stdout chunk (state=3): >>><<< 13273 1726853320.20848: stderr chunk (state=3): >>><<< 13273 1726853320.20851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853320.20853: _low_level_execute_command(): starting 13273 1726853320.20855: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/AnsiballZ_ping.py && sleep 0' 13273 1726853320.21498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853320.21512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853320.21525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853320.21542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853320.21564: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853320.21579: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853320.21593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.21609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853320.21620: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853320.21686: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.21717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853320.21735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853320.21760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853320.21854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853320.37283: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13273 1726853320.38560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853320.38730: stderr chunk (state=3): >>><<< 13273 1726853320.38733: stdout chunk (state=3): >>><<< 13273 1726853320.38736: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853320.38738: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853320.38741: _low_level_execute_command(): starting 13273 1726853320.38744: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853320.1020055-15027-64786972311580/ > /dev/null 2>&1 && sleep 0' 13273 1726853320.39896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.39996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853320.40115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853320.40225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853320.40315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853320.42347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853320.42357: stdout chunk (state=3): >>><<< 13273 1726853320.42359: stderr chunk (state=3): >>><<< 13273 1726853320.42605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853320.42609: handler run complete 13273 1726853320.42612: attempt loop complete, returning result 13273 1726853320.42614: _execute() done 13273 1726853320.42615: dumping result to json 13273 1726853320.42617: done dumping result, returning 13273 1726853320.42619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-5fc3-657d-0000000000e4] 13273 1726853320.42622: sending task result for task 02083763-bbaf-5fc3-657d-0000000000e4 13273 1726853320.42696: done sending task result for task 02083763-bbaf-5fc3-657d-0000000000e4 13273 1726853320.42699: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 13273 1726853320.42776: no more pending results, returning what we have 13273 1726853320.42780: results queue empty 13273 1726853320.42781: checking for any_errors_fatal 13273 1726853320.42788: done checking for any_errors_fatal 13273 1726853320.42789: checking for max_fail_percentage 13273 1726853320.42791: done checking for max_fail_percentage 13273 1726853320.42792: checking to see if all hosts have failed and the running result is not ok 13273 1726853320.42793: done checking to see if all hosts have failed 13273 1726853320.42793: getting the remaining hosts for this loop 13273 1726853320.42795: done getting the remaining hosts for this loop 13273 1726853320.42798: getting the next task for host managed_node3 13273 1726853320.42809: done getting next task for host managed_node3 13273 1726853320.42812: ^ task is: TASK: meta (role_complete) 13273 1726853320.42815: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853320.42830: getting variables 13273 1726853320.42832: in VariableManager get_vars() 13273 1726853320.43102: Calling all_inventory to load vars for managed_node3 13273 1726853320.43106: Calling groups_inventory to load vars for managed_node3 13273 1726853320.43109: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853320.43120: Calling all_plugins_play to load vars for managed_node3 13273 1726853320.43124: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853320.43128: Calling groups_plugins_play to load vars for managed_node3 13273 1726853320.46391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853320.49631: done with get_vars() 13273 1726853320.49658: done getting variables 13273 1726853320.49862: done queuing things up, now waiting for results queue to drain 13273 1726853320.49864: results queue empty 13273 1726853320.49865: checking for any_errors_fatal 13273 1726853320.49868: done checking for any_errors_fatal 13273 1726853320.49869: checking for max_fail_percentage 13273 1726853320.49870: done checking for max_fail_percentage 13273 1726853320.49936: checking to see if all hosts have failed and the running result is not ok 13273 1726853320.49937: done checking to see if all hosts have failed 13273 1726853320.49938: getting the remaining hosts for this loop 13273 1726853320.49939: done getting the remaining hosts for this loop 13273 1726853320.49942: getting the next task for host managed_node3 13273 1726853320.49947: done getting next task for host 
managed_node3 13273 1726853320.49951: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13273 1726853320.49953: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853320.49964: getting variables 13273 1726853320.49965: in VariableManager get_vars() 13273 1726853320.49991: Calling all_inventory to load vars for managed_node3 13273 1726853320.49993: Calling groups_inventory to load vars for managed_node3 13273 1726853320.49995: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853320.50001: Calling all_plugins_play to load vars for managed_node3 13273 1726853320.50003: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853320.50006: Calling groups_plugins_play to load vars for managed_node3 13273 1726853320.52158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853320.53952: done with get_vars() 13273 1726853320.53985: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:28:40 -0400 (0:00:00.492) 0:00:38.429 ****** 13273 1726853320.54064: entering _queue_task() for managed_node3/include_tasks 13273 1726853320.54430: worker is 1 (out of 1 available) 13273 1726853320.54446: exiting _queue_task() for 
managed_node3/include_tasks 13273 1726853320.54458: done queuing things up, now waiting for results queue to drain 13273 1726853320.54459: waiting for pending results... 13273 1726853320.54800: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13273 1726853320.54924: in run() - task 02083763-bbaf-5fc3-657d-00000000011b 13273 1726853320.54936: variable 'ansible_search_path' from source: unknown 13273 1726853320.54940: variable 'ansible_search_path' from source: unknown 13273 1726853320.54977: calling self._execute() 13273 1726853320.55054: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853320.55059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853320.55067: variable 'omit' from source: magic vars 13273 1726853320.55359: variable 'ansible_distribution_major_version' from source: facts 13273 1726853320.55370: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853320.55377: _execute() done 13273 1726853320.55380: dumping result to json 13273 1726853320.55382: done dumping result, returning 13273 1726853320.55389: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-5fc3-657d-00000000011b] 13273 1726853320.55394: sending task result for task 02083763-bbaf-5fc3-657d-00000000011b 13273 1726853320.55479: done sending task result for task 02083763-bbaf-5fc3-657d-00000000011b 13273 1726853320.55481: WORKER PROCESS EXITING 13273 1726853320.55524: no more pending results, returning what we have 13273 1726853320.55527: in VariableManager get_vars() 13273 1726853320.55583: Calling all_inventory to load vars for managed_node3 13273 1726853320.55586: Calling groups_inventory to load vars for managed_node3 13273 1726853320.55589: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853320.55601: Calling 
all_plugins_play to load vars for managed_node3 13273 1726853320.55604: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853320.55607: Calling groups_plugins_play to load vars for managed_node3 13273 1726853320.56713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853320.58634: done with get_vars() 13273 1726853320.58654: variable 'ansible_search_path' from source: unknown 13273 1726853320.58655: variable 'ansible_search_path' from source: unknown 13273 1726853320.58702: we have included files to process 13273 1726853320.58704: generating all_blocks data 13273 1726853320.58706: done generating all_blocks data 13273 1726853320.58712: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853320.58713: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853320.58716: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853320.59140: done processing included file 13273 1726853320.59142: iterating over new_blocks loaded from include file 13273 1726853320.59143: in VariableManager get_vars() 13273 1726853320.59164: done with get_vars() 13273 1726853320.59165: filtering new block on tags 13273 1726853320.59179: done filtering new block on tags 13273 1726853320.59181: in VariableManager get_vars() 13273 1726853320.59199: done with get_vars() 13273 1726853320.59200: filtering new block on tags 13273 1726853320.59211: done filtering new block on tags 13273 1726853320.59213: in VariableManager get_vars() 13273 1726853320.59230: done with get_vars() 13273 1726853320.59231: filtering new block on tags 13273 1726853320.59241: done filtering new block on tags 13273 1726853320.59243: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13273 1726853320.59248: extending task lists for all hosts with included blocks 13273 1726853320.59763: done extending task lists 13273 1726853320.59765: done processing included files 13273 1726853320.59766: results queue empty 13273 1726853320.59767: checking for any_errors_fatal 13273 1726853320.59769: done checking for any_errors_fatal 13273 1726853320.59770: checking for max_fail_percentage 13273 1726853320.59773: done checking for max_fail_percentage 13273 1726853320.59774: checking to see if all hosts have failed and the running result is not ok 13273 1726853320.59774: done checking to see if all hosts have failed 13273 1726853320.59775: getting the remaining hosts for this loop 13273 1726853320.59776: done getting the remaining hosts for this loop 13273 1726853320.59779: getting the next task for host managed_node3 13273 1726853320.59782: done getting next task for host managed_node3 13273 1726853320.59785: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13273 1726853320.59788: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13273 1726853320.59797: getting variables 13273 1726853320.59798: in VariableManager get_vars() 13273 1726853320.59820: Calling all_inventory to load vars for managed_node3 13273 1726853320.59823: Calling groups_inventory to load vars for managed_node3 13273 1726853320.59825: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853320.59831: Calling all_plugins_play to load vars for managed_node3 13273 1726853320.59834: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853320.59837: Calling groups_plugins_play to load vars for managed_node3 13273 1726853320.61173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853320.62547: done with get_vars() 13273 1726853320.62568: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:28:40 -0400 (0:00:00.085) 0:00:38.515 ****** 13273 1726853320.62645: entering _queue_task() for managed_node3/setup 13273 1726853320.63001: worker is 1 (out of 1 available) 13273 1726853320.63015: exiting _queue_task() for managed_node3/setup 13273 1726853320.63028: done queuing things up, now waiting for results queue to drain 13273 1726853320.63029: waiting for pending results... 
13273 1726853320.63240: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13273 1726853320.63344: in run() - task 02083763-bbaf-5fc3-657d-00000000084f 13273 1726853320.63359: variable 'ansible_search_path' from source: unknown 13273 1726853320.63362: variable 'ansible_search_path' from source: unknown 13273 1726853320.63395: calling self._execute() 13273 1726853320.63479: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853320.63484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853320.63494: variable 'omit' from source: magic vars 13273 1726853320.63767: variable 'ansible_distribution_major_version' from source: facts 13273 1726853320.63778: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853320.63923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853320.65762: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853320.65813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853320.65840: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853320.65875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853320.65891: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853320.65945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853320.65968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853320.65991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853320.66016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853320.66028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853320.66066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853320.66085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853320.66103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853320.66127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853320.66138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853320.66252: variable '__network_required_facts' from source: role 
'' defaults 13273 1726853320.66260: variable 'ansible_facts' from source: unknown 13273 1726853320.66705: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13273 1726853320.66709: when evaluation is False, skipping this task 13273 1726853320.66712: _execute() done 13273 1726853320.66714: dumping result to json 13273 1726853320.66717: done dumping result, returning 13273 1726853320.66722: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-5fc3-657d-00000000084f] 13273 1726853320.66727: sending task result for task 02083763-bbaf-5fc3-657d-00000000084f 13273 1726853320.66811: done sending task result for task 02083763-bbaf-5fc3-657d-00000000084f 13273 1726853320.66814: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853320.66887: no more pending results, returning what we have 13273 1726853320.66891: results queue empty 13273 1726853320.66892: checking for any_errors_fatal 13273 1726853320.66894: done checking for any_errors_fatal 13273 1726853320.66894: checking for max_fail_percentage 13273 1726853320.66896: done checking for max_fail_percentage 13273 1726853320.66897: checking to see if all hosts have failed and the running result is not ok 13273 1726853320.66897: done checking to see if all hosts have failed 13273 1726853320.66898: getting the remaining hosts for this loop 13273 1726853320.66899: done getting the remaining hosts for this loop 13273 1726853320.66902: getting the next task for host managed_node3 13273 1726853320.66910: done getting next task for host managed_node3 13273 1726853320.66914: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13273 1726853320.66917: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853320.66938: getting variables 13273 1726853320.66939: in VariableManager get_vars() 13273 1726853320.66989: Calling all_inventory to load vars for managed_node3 13273 1726853320.66992: Calling groups_inventory to load vars for managed_node3 13273 1726853320.66994: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853320.67002: Calling all_plugins_play to load vars for managed_node3 13273 1726853320.67004: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853320.67007: Calling groups_plugins_play to load vars for managed_node3 13273 1726853320.67772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853320.68632: done with get_vars() 13273 1726853320.68648: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:28:40 -0400 (0:00:00.060) 0:00:38.576 ****** 13273 1726853320.68723: entering _queue_task() for managed_node3/stat 13273 1726853320.68955: worker is 1 (out of 1 
available) 13273 1726853320.68969: exiting _queue_task() for managed_node3/stat 13273 1726853320.68983: done queuing things up, now waiting for results queue to drain 13273 1726853320.68984: waiting for pending results... 13273 1726853320.69161: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13273 1726853320.69264: in run() - task 02083763-bbaf-5fc3-657d-000000000851 13273 1726853320.69277: variable 'ansible_search_path' from source: unknown 13273 1726853320.69280: variable 'ansible_search_path' from source: unknown 13273 1726853320.69307: calling self._execute() 13273 1726853320.69388: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853320.69392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853320.69401: variable 'omit' from source: magic vars 13273 1726853320.69678: variable 'ansible_distribution_major_version' from source: facts 13273 1726853320.69687: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853320.69808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853320.70006: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853320.70038: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853320.70064: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853320.70093: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853320.70155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853320.70175: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853320.70196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853320.70214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853320.70276: variable '__network_is_ostree' from source: set_fact 13273 1726853320.70282: Evaluated conditional (not __network_is_ostree is defined): False 13273 1726853320.70285: when evaluation is False, skipping this task 13273 1726853320.70289: _execute() done 13273 1726853320.70292: dumping result to json 13273 1726853320.70296: done dumping result, returning 13273 1726853320.70306: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-5fc3-657d-000000000851] 13273 1726853320.70309: sending task result for task 02083763-bbaf-5fc3-657d-000000000851 13273 1726853320.70383: done sending task result for task 02083763-bbaf-5fc3-657d-000000000851 13273 1726853320.70386: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13273 1726853320.70456: no more pending results, returning what we have 13273 1726853320.70460: results queue empty 13273 1726853320.70461: checking for any_errors_fatal 13273 1726853320.70469: done checking for any_errors_fatal 13273 1726853320.70470: checking for max_fail_percentage 13273 1726853320.70473: done checking for max_fail_percentage 13273 1726853320.70474: checking to see if all hosts have failed and the running result is not ok 13273 
1726853320.70475: done checking to see if all hosts have failed 13273 1726853320.70476: getting the remaining hosts for this loop 13273 1726853320.70477: done getting the remaining hosts for this loop 13273 1726853320.70480: getting the next task for host managed_node3 13273 1726853320.70487: done getting next task for host managed_node3 13273 1726853320.70490: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13273 1726853320.70493: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853320.70510: getting variables 13273 1726853320.70511: in VariableManager get_vars() 13273 1726853320.70554: Calling all_inventory to load vars for managed_node3 13273 1726853320.70556: Calling groups_inventory to load vars for managed_node3 13273 1726853320.70559: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853320.70565: Calling all_plugins_play to load vars for managed_node3 13273 1726853320.70568: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853320.70572: Calling groups_plugins_play to load vars for managed_node3 13273 1726853320.71400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853320.72244: done with get_vars() 13273 1726853320.72258: done getting variables 13273 1726853320.72299: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:28:40 -0400 (0:00:00.036) 0:00:38.612 ****** 13273 1726853320.72325: entering _queue_task() for managed_node3/set_fact 13273 1726853320.72537: worker is 1 (out of 1 available) 13273 1726853320.72551: exiting _queue_task() for managed_node3/set_fact 13273 1726853320.72563: done queuing things up, now waiting for results queue to drain 13273 1726853320.72564: waiting for pending results... 
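The skip recorded above comes from the role's guard conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: the `Ensure ansible_facts used by role are present` task only gathers facts when some required fact is missing. A minimal Python sketch of that set-difference check (the fact names and values below are illustrative, not taken from this run):

```python
# Sketch of the conditional evaluated in the log:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Ansible's `difference` filter is set subtraction, so the task runs only
# when at least one required fact is absent from ansible_facts.

required_facts = ["distribution", "os_family"]      # stand-in for __network_required_facts
ansible_facts = {                                   # stand-in for the gathered facts
    "distribution": "CentOS",
    "os_family": "RedHat",
    "kernel": "5.14.0",
}

missing = set(required_facts) - set(ansible_facts.keys())
should_run = len(missing) > 0
print(should_run)  # False -> "when evaluation is False, skipping this task"
```

With every required fact already present the difference is empty, which matches the `Evaluated conditional (...): False` line and the skipped result above.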
13273 1726853320.72744: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13273 1726853320.72837: in run() - task 02083763-bbaf-5fc3-657d-000000000852 13273 1726853320.72851: variable 'ansible_search_path' from source: unknown 13273 1726853320.72855: variable 'ansible_search_path' from source: unknown 13273 1726853320.72880: calling self._execute() 13273 1726853320.72955: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853320.72959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853320.72968: variable 'omit' from source: magic vars 13273 1726853320.73242: variable 'ansible_distribution_major_version' from source: facts 13273 1726853320.73251: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853320.73365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853320.73556: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853320.73588: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853320.73611: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853320.73636: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853320.73700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853320.73717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853320.73735: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853320.73753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853320.73817: variable '__network_is_ostree' from source: set_fact 13273 1726853320.73823: Evaluated conditional (not __network_is_ostree is defined): False 13273 1726853320.73826: when evaluation is False, skipping this task 13273 1726853320.73829: _execute() done 13273 1726853320.73831: dumping result to json 13273 1726853320.73836: done dumping result, returning 13273 1726853320.73843: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-5fc3-657d-000000000852] 13273 1726853320.73849: sending task result for task 02083763-bbaf-5fc3-657d-000000000852 13273 1726853320.73920: done sending task result for task 02083763-bbaf-5fc3-657d-000000000852 13273 1726853320.73923: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13273 1726853320.73968: no more pending results, returning what we have 13273 1726853320.73974: results queue empty 13273 1726853320.73975: checking for any_errors_fatal 13273 1726853320.73980: done checking for any_errors_fatal 13273 1726853320.73981: checking for max_fail_percentage 13273 1726853320.73983: done checking for max_fail_percentage 13273 1726853320.73984: checking to see if all hosts have failed and the running result is not ok 13273 1726853320.73984: done checking to see if all hosts have failed 13273 1726853320.73985: getting the remaining hosts for this loop 13273 1726853320.73986: done getting the remaining hosts for this loop 
13273 1726853320.73989: getting the next task for host managed_node3 13273 1726853320.73997: done getting next task for host managed_node3 13273 1726853320.74001: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13273 1726853320.74005: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853320.74021: getting variables 13273 1726853320.74022: in VariableManager get_vars() 13273 1726853320.74065: Calling all_inventory to load vars for managed_node3 13273 1726853320.74067: Calling groups_inventory to load vars for managed_node3 13273 1726853320.74069: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853320.74084: Calling all_plugins_play to load vars for managed_node3 13273 1726853320.74088: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853320.74091: Calling groups_plugins_play to load vars for managed_node3 13273 1726853320.74825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853320.76557: done with get_vars() 13273 1726853320.76585: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:28:40 -0400 (0:00:00.043) 0:00:38.655 ****** 13273 1726853320.76689: entering _queue_task() for managed_node3/service_facts 13273 1726853320.76998: worker is 1 (out of 1 available) 13273 1726853320.77019: exiting _queue_task() for managed_node3/service_facts 13273 1726853320.77032: done queuing things up, now waiting for results queue to drain 13273 1726853320.77034: waiting for pending results... 
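The `Check if system is ostree` / `Set flag to indicate system is ostree` pair skipped above follows a compute-once pattern: both tasks are guarded by `when: not __network_is_ostree is defined`, so after the fact has been set once (earlier in this play, via `set_fact`), every later invocation of the role skips both tasks. A rough Python analogue of that caching guard (logic simplified; the variable name is taken from the log, the function is hypothetical):

```python
# Compute-once guard, as used by the ostree check in the log:
#   when: not __network_is_ostree is defined
facts = {}

def check_ostree(facts):
    """Run the stat/set_fact pair only when the flag is not cached yet."""
    if "__network_is_ostree" not in facts:       # not __network_is_ostree is defined
        facts["__network_is_ostree"] = False     # stand-in for the real stat result
        return "ran"
    return "skipped"                             # conditional is False -> task skipped

first = check_ostree(facts)    # "ran" on the first pass
second = check_ostree(facts)   # "skipped" on every later pass, as seen above
```

This is why the log shows `variable '__network_is_ostree' from source: set_fact` immediately before `Evaluated conditional (not __network_is_ostree is defined): False` for both tasks.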
13273 1726853320.77487: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13273 1726853320.77515: in run() - task 02083763-bbaf-5fc3-657d-000000000854 13273 1726853320.77530: variable 'ansible_search_path' from source: unknown 13273 1726853320.77533: variable 'ansible_search_path' from source: unknown 13273 1726853320.77579: calling self._execute() 13273 1726853320.77683: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853320.77878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853320.77882: variable 'omit' from source: magic vars 13273 1726853320.78097: variable 'ansible_distribution_major_version' from source: facts 13273 1726853320.78117: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853320.78124: variable 'omit' from source: magic vars 13273 1726853320.78193: variable 'omit' from source: magic vars 13273 1726853320.78238: variable 'omit' from source: magic vars 13273 1726853320.78276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853320.78313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853320.78341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853320.78358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853320.78369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853320.78402: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853320.78405: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853320.78409: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13273 1726853320.78514: Set connection var ansible_connection to ssh 13273 1726853320.78525: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853320.78530: Set connection var ansible_shell_executable to /bin/sh 13273 1726853320.78533: Set connection var ansible_shell_type to sh 13273 1726853320.78550: Set connection var ansible_pipelining to False 13273 1726853320.78556: Set connection var ansible_timeout to 10 13273 1726853320.78587: variable 'ansible_shell_executable' from source: unknown 13273 1726853320.78590: variable 'ansible_connection' from source: unknown 13273 1726853320.78594: variable 'ansible_module_compression' from source: unknown 13273 1726853320.78596: variable 'ansible_shell_type' from source: unknown 13273 1726853320.78598: variable 'ansible_shell_executable' from source: unknown 13273 1726853320.78600: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853320.78602: variable 'ansible_pipelining' from source: unknown 13273 1726853320.78607: variable 'ansible_timeout' from source: unknown 13273 1726853320.78609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853320.78951: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853320.78956: variable 'omit' from source: magic vars 13273 1726853320.78958: starting attempt loop 13273 1726853320.78960: running the handler 13273 1726853320.78962: _low_level_execute_command(): starting 13273 1726853320.78964: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853320.79751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853320.79954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13273 1726853320.79961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853320.79965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.79968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853320.79970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853320.79974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853320.80105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853320.81822: stdout chunk (state=3): >>>/root <<< 13273 1726853320.81922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853320.81945: stderr chunk (state=3): >>><<< 13273 1726853320.81948: stdout chunk (state=3): >>><<< 13273 1726853320.81966: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853320.81978: _low_level_execute_command(): starting 13273 1726853320.81984: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499 `" && echo ansible-tmp-1726853320.8196533-15062-68749360518499="` echo /root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499 `" ) && sleep 0' 13273 1726853320.82686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853320.82716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853320.82813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853320.84751: stdout chunk (state=3): >>>ansible-tmp-1726853320.8196533-15062-68749360518499=/root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499 <<< 13273 1726853320.84862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853320.84880: stderr chunk (state=3): >>><<< 13273 1726853320.84883: stdout chunk (state=3): >>><<< 13273 1726853320.84900: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853320.8196533-15062-68749360518499=/root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853320.84936: variable 'ansible_module_compression' from source: unknown 13273 1726853320.84974: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13273 1726853320.85005: variable 'ansible_facts' from source: unknown 13273 1726853320.85065: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/AnsiballZ_service_facts.py 13273 1726853320.85166: Sending initial data 13273 1726853320.85170: Sent initial data (161 bytes) 13273 1726853320.85587: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853320.85590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853320.85593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853320.85595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.85597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853320.85600: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.85638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853320.85653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853320.85712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853320.87332: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13273 1726853320.87336: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853320.87394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853320.87450: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpck43dbd4 /root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/AnsiballZ_service_facts.py <<< 13273 1726853320.87453: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/AnsiballZ_service_facts.py" <<< 13273 1726853320.87507: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpck43dbd4" to remote "/root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/AnsiballZ_service_facts.py" <<< 13273 1726853320.88123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853320.88160: stderr chunk (state=3): >>><<< 13273 1726853320.88163: stdout chunk (state=3): >>><<< 13273 1726853320.88221: done transferring module to remote 13273 1726853320.88230: _low_level_execute_command(): starting 13273 1726853320.88235: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/ /root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/AnsiballZ_service_facts.py && sleep 0' 13273 1726853320.88641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853320.88644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853320.88649: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.88652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853320.88654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853320.88699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853320.88709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853320.88764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853320.90710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853320.90715: stdout chunk (state=3): >>><<< 13273 1726853320.90717: stderr chunk (state=3): >>><<< 13273 1726853320.90732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853320.90817: _low_level_execute_command(): starting 13273 1726853320.90821: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/AnsiballZ_service_facts.py && sleep 0' 13273 1726853320.91385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853320.91399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853320.91429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853320.91538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 
1726853320.91557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853320.91581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853320.91700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853322.56513: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 13273 1726853322.56550: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 13273 1726853322.56566: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 13273 1726853322.56570: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 13273 1726853322.56593: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 13273 1726853322.56600: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13273 1726853322.58364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853322.58367: stdout chunk (state=3): >>><<< 13273 1726853322.58370: stderr chunk (state=3): >>><<< 13273 1726853322.58383: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", 
"status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": 
"rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853322.58870: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
13273 1726853322.58879: _low_level_execute_command(): starting
13273 1726853322.58884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853320.8196533-15062-68749360518499/ > /dev/null 2>&1 && sleep 0'
13273 1726853322.59307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853322.59310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<<
13273 1726853322.59313: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.59315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853322.59317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.59367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853322.59376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853322.59434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853322.61478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853322.61481: stdout chunk (state=3): >>><<<
13273 1726853322.61484: stderr chunk (state=3): >>><<<
13273 1726853322.61486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853322.61488: handler run complete
13273 1726853322.61621: variable 'ansible_facts' from source: unknown
13273 1726853322.61768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853322.62155: variable 'ansible_facts' from source: unknown
13273 1726853322.62236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853322.62350: attempt loop complete, returning result
13273 1726853322.62354: _execute() done
13273 1726853322.62358: dumping result to json
13273 1726853322.62394: done dumping result, returning
13273 1726853322.62402: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-5fc3-657d-000000000854]
13273 1726853322.62404: sending task result for task 02083763-bbaf-5fc3-657d-000000000854
ok: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13273 1726853322.63014: no more pending results, returning what we have
13273 1726853322.63017: results queue empty
13273 1726853322.63018: checking for any_errors_fatal
13273 1726853322.63022: done checking for any_errors_fatal
13273 1726853322.63023: checking for max_fail_percentage
13273 1726853322.63025: done checking for max_fail_percentage
13273 1726853322.63025: checking to see if all hosts have failed and the running result is not ok
13273 1726853322.63026: done checking to see if all hosts have failed
13273 1726853322.63027: getting the remaining hosts for this loop
13273 1726853322.63028: done getting the remaining hosts for this loop
13273 1726853322.63031: getting the next task for host managed_node3
13273 1726853322.63036: done getting next task for host managed_node3
13273 1726853322.63039: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed
13273 1726853322.63043: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13273 1726853322.63055: getting variables
13273 1726853322.63056: in VariableManager get_vars()
13273 1726853322.63097: Calling all_inventory to load vars for managed_node3
13273 1726853322.63100: Calling groups_inventory to load vars for managed_node3
13273 1726853322.63101: Calling all_plugins_inventory to load vars for managed_node3
13273 1726853322.63108: Calling all_plugins_play to load vars for managed_node3
13273 1726853322.63110: Calling groups_plugins_inventory to load vars for managed_node3
13273 1726853322.63113: Calling groups_plugins_play to load vars for managed_node3
13273 1726853322.63630: done sending task result for task 02083763-bbaf-5fc3-657d-000000000854
13273 1726853322.63633: WORKER PROCESS EXITING
13273 1726853322.63966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13273 1726853322.65206: done with get_vars()
13273 1726853322.65223: done getting variables
TASK [fedora.linux_system_roles.network : Check which packages are installed] ***
task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Friday 20 September 2024 13:28:42 -0400 (0:00:01.886) 0:00:40.542 ******
13273 1726853322.65302: entering _queue_task() for managed_node3/package_facts
13273 1726853322.65556: worker is 1 (out of 1 available)
13273 1726853322.65568: exiting _queue_task() for managed_node3/package_facts
13273 1726853322.65582: done queuing things up, now waiting for results queue to drain
13273 1726853322.65583: waiting for pending results...
13273 1726853322.65762: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed
13273 1726853322.65860: in run() - task 02083763-bbaf-5fc3-657d-000000000855
13273 1726853322.65872: variable 'ansible_search_path' from source: unknown
13273 1726853322.65877: variable 'ansible_search_path' from source: unknown
13273 1726853322.65904: calling self._execute()
13273 1726853322.65980: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853322.65986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853322.65993: variable 'omit' from source: magic vars
13273 1726853322.66269: variable 'ansible_distribution_major_version' from source: facts
13273 1726853322.66280: Evaluated conditional (ansible_distribution_major_version != '6'): True
13273 1726853322.66286: variable 'omit' from source: magic vars
13273 1726853322.66330: variable 'omit' from source: magic vars
13273 1726853322.66358: variable 'omit' from source: magic vars
13273 1726853322.66392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13273 1726853322.66418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13273 1726853322.66432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13273 1726853322.66448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853322.66458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13273 1726853322.66484: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13273 1726853322.66488: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853322.66490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853322.66555: Set connection var ansible_connection to ssh
13273 1726853322.66564: Set connection var ansible_module_compression to ZIP_DEFLATED
13273 1726853322.66574: Set connection var ansible_shell_executable to /bin/sh
13273 1726853322.66576: Set connection var ansible_shell_type to sh
13273 1726853322.66579: Set connection var ansible_pipelining to False
13273 1726853322.66587: Set connection var ansible_timeout to 10
13273 1726853322.66605: variable 'ansible_shell_executable' from source: unknown
13273 1726853322.66608: variable 'ansible_connection' from source: unknown
13273 1726853322.66611: variable 'ansible_module_compression' from source: unknown
13273 1726853322.66613: variable 'ansible_shell_type' from source: unknown
13273 1726853322.66615: variable 'ansible_shell_executable' from source: unknown
13273 1726853322.66618: variable 'ansible_host' from source: host vars for 'managed_node3'
13273 1726853322.66623: variable 'ansible_pipelining' from source: unknown
13273 1726853322.66625: variable 'ansible_timeout' from source: unknown
13273 1726853322.66629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13273 1726853322.66775: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
13273 1726853322.66782: variable 'omit' from source: magic vars
13273 1726853322.66788: starting attempt loop
13273 1726853322.66791: running the handler
13273 1726853322.66804: _low_level_execute_command(): starting
13273 1726853322.66810: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13273 1726853322.67316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853322.67319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.67323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13273 1726853322.67328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.67376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853322.67380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853322.67397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853322.67457: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<<
13273 1726853322.69177: stdout chunk (state=3): >>>/root <<<
13273 1726853322.69263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853322.69297: stderr chunk (state=3): >>><<<
13273 1726853322.69300: stdout chunk (state=3): >>><<<
13273 1726853322.69322: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853322.69332: _low_level_execute_command(): starting
13273 1726853322.69338: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219 `" && echo ansible-tmp-1726853322.6932032-15128-90348362605219="` echo /root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219 `" ) && sleep 0'
13273 1726853322.69793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853322.69797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<<
13273 1726853322.69799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
13273 1726853322.69811: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<<
13273 1726853322.69814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.69862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853322.69866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853322.69868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853322.69924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853322.71875: stdout chunk (state=3): >>>ansible-tmp-1726853322.6932032-15128-90348362605219=/root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219 <<<
13273 1726853322.71985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853322.72011: stderr chunk (state=3): >>><<<
13273 1726853322.72015: stdout chunk (state=3): >>><<<
13273 1726853322.72028: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853322.6932032-15128-90348362605219=/root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853322.72068: variable 'ansible_module_compression' from source: unknown
13273 1726853322.72110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED
13273 1726853322.72162: variable 'ansible_facts' from source: unknown
13273 1726853322.72286: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/AnsiballZ_package_facts.py
13273 1726853322.72382: Sending initial data
13273 1726853322.72394: Sent initial data (161 bytes)
13273 1726853322.72834: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13273 1726853322.72837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<<
13273 1726853322.72840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
13273 1726853322.72842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853322.72847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.72896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853322.72900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13273 1726853322.72904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853322.72965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853322.74588: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
13273 1726853322.74592: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
13273 1726853322.74646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
13273 1726853322.74710: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpn4rbnge9 /root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/AnsiballZ_package_facts.py <<<
13273 1726853322.74716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/AnsiballZ_package_facts.py" <<<
13273 1726853322.74767: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpn4rbnge9" to remote "/root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/AnsiballZ_package_facts.py" <<<
13273 1726853322.75922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853322.75961: stderr chunk (state=3): >>><<<
13273 1726853322.75964: stdout chunk (state=3): >>><<<
13273 1726853322.76008: done transferring module to remote
13273 1726853322.76015: _low_level_execute_command(): starting
13273 1726853322.76019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/ /root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/AnsiballZ_package_facts.py && sleep 0'
13273 1726853322.76453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853322.76457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.76459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853322.76465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.76509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853322.76512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853322.76577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853322.78424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13273 1726853322.78451: stderr chunk (state=3): >>><<<
13273 1726853322.78454: stdout chunk (state=3): >>><<<
13273 1726853322.78464: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13273 1726853322.78468: _low_level_execute_command(): starting
13273 1726853322.78474: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/AnsiballZ_package_facts.py && sleep 0'
13273 1726853322.78919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853322.78922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.78926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<<
13273 1726853322.78928: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13273 1726853322.78931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13273 1726853322.78981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
13273 1726853322.78984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13273 1726853322.79057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13273 1726853323.24129: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": 
"google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", 
"version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 13273 1726853323.24201: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", 
"version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": 
[{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", 
"version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 13273 1726853323.24332: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 13273 1726853323.24338: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": 
"libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 13273 1726853323.24368: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": 
[{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": 
"1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": 
"python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13273 1726853323.26120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
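The module result above is the standard `package_facts` structure: `ansible_facts.packages` maps each package name to a list of installed instances, each carrying `name`, `version`, `release`, `epoch` (often `null`), `arch`, and `source`. A minimal sketch of how that structure could be queried outside Ansible, using a two-package excerpt taken from the log above (the `package_evr` helper is illustrative, not part of Ansible):

```python
import json

# Tiny excerpt mirroring the ansible_facts.packages structure in the log above;
# each key maps to a list of installed instances of that package.
facts_json = '''
{"ansible_facts": {"packages": {
  "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
               "epoch": 1, "arch": "x86_64", "source": "rpm"}],
  "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10",
            "epoch": null, "arch": "x86_64", "source": "rpm"}]
}}}
'''

def package_evr(packages: dict, name: str) -> str:
    """Format the first installed instance of a package as [epoch:]version-release.
    (Helper name is hypothetical; RPM omits the epoch prefix when it is null.)"""
    pkg = packages[name][0]
    evr = f"{pkg['version']}-{pkg['release']}"
    if pkg["epoch"] is not None:
        evr = f"{pkg['epoch']}:{evr}"
    return evr

packages = json.loads(facts_json)["ansible_facts"]["packages"]
print(package_evr(packages, "openssl"))  # -> 1:3.2.2-12.el10
print(package_evr(packages, "bash"))     # -> 5.2.26-4.el10
```

Inside a playbook the same lookup is just `ansible_facts.packages['openssl'][0].version` after a `package_facts` task has run; the list value exists because multiple versions of one package (e.g. kernels) can be installed side by side.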
<<< 13273 1726853323.26148: stderr chunk (state=3): >>><<< 13273 1726853323.26160: stdout chunk (state=3): >>><<< 13273 1726853323.26247: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853323.27828: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853323.27844: _low_level_execute_command(): starting 13273 1726853323.27918: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853322.6932032-15128-90348362605219/ > /dev/null 2>&1 && sleep 0' 13273 1726853323.28492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853323.28512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853323.28612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853323.30474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853323.30495: stderr chunk (state=3): >>><<< 13273 1726853323.30505: stdout chunk (state=3): >>><<< 13273 1726853323.30541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853323.30545: handler run complete 13273 1726853323.31261: variable 'ansible_facts' from source: unknown 13273 1726853323.31778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 
1726853323.33136: variable 'ansible_facts' from source: unknown 13273 1726853323.33543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.34786: attempt loop complete, returning result 13273 1726853323.34805: _execute() done 13273 1726853323.34874: dumping result to json 13273 1726853323.35009: done dumping result, returning 13273 1726853323.35024: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-5fc3-657d-000000000855] 13273 1726853323.35036: sending task result for task 02083763-bbaf-5fc3-657d-000000000855 13273 1726853323.36689: done sending task result for task 02083763-bbaf-5fc3-657d-000000000855 13273 1726853323.36692: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853323.36778: no more pending results, returning what we have 13273 1726853323.36780: results queue empty 13273 1726853323.36780: checking for any_errors_fatal 13273 1726853323.36785: done checking for any_errors_fatal 13273 1726853323.36785: checking for max_fail_percentage 13273 1726853323.36786: done checking for max_fail_percentage 13273 1726853323.36787: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.36787: done checking to see if all hosts have failed 13273 1726853323.36788: getting the remaining hosts for this loop 13273 1726853323.36789: done getting the remaining hosts for this loop 13273 1726853323.36791: getting the next task for host managed_node3 13273 1726853323.36795: done getting next task for host managed_node3 13273 1726853323.36797: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853323.36799: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853323.36805: getting variables 13273 1726853323.36806: in VariableManager get_vars() 13273 1726853323.36837: Calling all_inventory to load vars for managed_node3 13273 1726853323.36839: Calling groups_inventory to load vars for managed_node3 13273 1726853323.36840: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.36846: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.36849: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.36850: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.37518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.38880: done with get_vars() 13273 1726853323.38896: done getting variables 13273 1726853323.38943: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:28:43 -0400 (0:00:00.736) 0:00:41.278 ****** 13273 1726853323.38969: entering _queue_task() for managed_node3/debug 13273 1726853323.39198: worker is 1 (out of 1 
available) 13273 1726853323.39211: exiting _queue_task() for managed_node3/debug 13273 1726853323.39223: done queuing things up, now waiting for results queue to drain 13273 1726853323.39223: waiting for pending results... 13273 1726853323.39405: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853323.39491: in run() - task 02083763-bbaf-5fc3-657d-00000000011c 13273 1726853323.39502: variable 'ansible_search_path' from source: unknown 13273 1726853323.39506: variable 'ansible_search_path' from source: unknown 13273 1726853323.39534: calling self._execute() 13273 1726853323.39613: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.39617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.39625: variable 'omit' from source: magic vars 13273 1726853323.39895: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.39905: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.39911: variable 'omit' from source: magic vars 13273 1726853323.39950: variable 'omit' from source: magic vars 13273 1726853323.40018: variable 'network_provider' from source: set_fact 13273 1726853323.40032: variable 'omit' from source: magic vars 13273 1726853323.40065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853323.40093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853323.40110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853323.40123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853323.40133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13273 1726853323.40159: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853323.40162: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.40164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.40232: Set connection var ansible_connection to ssh 13273 1726853323.40240: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853323.40245: Set connection var ansible_shell_executable to /bin/sh 13273 1726853323.40250: Set connection var ansible_shell_type to sh 13273 1726853323.40256: Set connection var ansible_pipelining to False 13273 1726853323.40262: Set connection var ansible_timeout to 10 13273 1726853323.40283: variable 'ansible_shell_executable' from source: unknown 13273 1726853323.40286: variable 'ansible_connection' from source: unknown 13273 1726853323.40288: variable 'ansible_module_compression' from source: unknown 13273 1726853323.40291: variable 'ansible_shell_type' from source: unknown 13273 1726853323.40294: variable 'ansible_shell_executable' from source: unknown 13273 1726853323.40296: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.40298: variable 'ansible_pipelining' from source: unknown 13273 1726853323.40301: variable 'ansible_timeout' from source: unknown 13273 1726853323.40304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.40405: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853323.40413: variable 'omit' from source: magic vars 13273 1726853323.40418: starting attempt loop 13273 1726853323.40423: running the handler 13273 
1726853323.40460: handler run complete 13273 1726853323.40470: attempt loop complete, returning result 13273 1726853323.40475: _execute() done 13273 1726853323.40478: dumping result to json 13273 1726853323.40481: done dumping result, returning 13273 1726853323.40487: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-5fc3-657d-00000000011c] 13273 1726853323.40491: sending task result for task 02083763-bbaf-5fc3-657d-00000000011c 13273 1726853323.40564: done sending task result for task 02083763-bbaf-5fc3-657d-00000000011c 13273 1726853323.40567: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 13273 1726853323.40629: no more pending results, returning what we have 13273 1726853323.40632: results queue empty 13273 1726853323.40634: checking for any_errors_fatal 13273 1726853323.40642: done checking for any_errors_fatal 13273 1726853323.40642: checking for max_fail_percentage 13273 1726853323.40644: done checking for max_fail_percentage 13273 1726853323.40645: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.40645: done checking to see if all hosts have failed 13273 1726853323.40646: getting the remaining hosts for this loop 13273 1726853323.40647: done getting the remaining hosts for this loop 13273 1726853323.40650: getting the next task for host managed_node3 13273 1726853323.40656: done getting next task for host managed_node3 13273 1726853323.40659: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853323.40661: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853323.40673: getting variables 13273 1726853323.40674: in VariableManager get_vars() 13273 1726853323.40716: Calling all_inventory to load vars for managed_node3 13273 1726853323.40719: Calling groups_inventory to load vars for managed_node3 13273 1726853323.40721: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.40728: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.40731: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.40733: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.41454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.42306: done with get_vars() 13273 1726853323.42321: done getting variables 13273 1726853323.42359: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:28:43 -0400 (0:00:00.034) 0:00:41.313 ****** 13273 1726853323.42384: entering _queue_task() for managed_node3/fail 13273 1726853323.42590: worker is 1 (out of 1 available) 
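The `Evaluated conditional (...)` lines in this trace are Jinja2 expressions rendered against the host's facts, which arrive as strings. A minimal pure-Python sketch (not Ansible's actual templar) of why the role compares against the string `'6'` directly but applies `| int` before ordered comparisons; the facts dict below is an assumption for illustration:

```python
# Stand-in for gathered ansible_facts; fact values are strings.
facts = {"ansible_distribution_major_version": "10"}

# Inequality against a string literal is safe regardless of numeric value,
# mirroring the logged check (ansible_distribution_major_version != '6'):
assert (facts["ansible_distribution_major_version"] != "6") is True

# Ordered comparisons must cast first (the role's `| int` filter does this),
# because lexicographic "10" > "9" is False:
assert ("10" > "9") is False
assert (int(facts["ansible_distribution_major_version"]) > 9) is True
```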
13273 1726853323.42603: exiting _queue_task() for managed_node3/fail 13273 1726853323.42613: done queuing things up, now waiting for results queue to drain 13273 1726853323.42614: waiting for pending results... 13273 1726853323.42788: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853323.42882: in run() - task 02083763-bbaf-5fc3-657d-00000000011d 13273 1726853323.42894: variable 'ansible_search_path' from source: unknown 13273 1726853323.42897: variable 'ansible_search_path' from source: unknown 13273 1726853323.42922: calling self._execute() 13273 1726853323.42994: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.42999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.43007: variable 'omit' from source: magic vars 13273 1726853323.43270: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.43281: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.43362: variable 'network_state' from source: role '' defaults 13273 1726853323.43372: Evaluated conditional (network_state != {}): False 13273 1726853323.43376: when evaluation is False, skipping this task 13273 1726853323.43379: _execute() done 13273 1726853323.43383: dumping result to json 13273 1726853323.43386: done dumping result, returning 13273 1726853323.43394: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-5fc3-657d-00000000011d] 13273 1726853323.43397: sending task result for task 02083763-bbaf-5fc3-657d-00000000011d 13273 1726853323.43480: done sending task result for task 02083763-bbaf-5fc3-657d-00000000011d 13273 1726853323.43483: WORKER 
PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853323.43537: no more pending results, returning what we have 13273 1726853323.43540: results queue empty 13273 1726853323.43541: checking for any_errors_fatal 13273 1726853323.43546: done checking for any_errors_fatal 13273 1726853323.43547: checking for max_fail_percentage 13273 1726853323.43548: done checking for max_fail_percentage 13273 1726853323.43549: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.43550: done checking to see if all hosts have failed 13273 1726853323.43550: getting the remaining hosts for this loop 13273 1726853323.43552: done getting the remaining hosts for this loop 13273 1726853323.43555: getting the next task for host managed_node3 13273 1726853323.43559: done getting next task for host managed_node3 13273 1726853323.43563: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853323.43565: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853323.43582: getting variables 13273 1726853323.43584: in VariableManager get_vars() 13273 1726853323.43629: Calling all_inventory to load vars for managed_node3 13273 1726853323.43632: Calling groups_inventory to load vars for managed_node3 13273 1726853323.43634: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.43641: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.43644: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.43646: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.44465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.48672: done with get_vars() 13273 1726853323.48689: done getting variables 13273 1726853323.48733: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:28:43 -0400 (0:00:00.063) 0:00:41.376 ****** 13273 1726853323.48758: entering _queue_task() for managed_node3/fail 13273 1726853323.49013: worker is 1 (out of 1 available) 13273 1726853323.49027: exiting _queue_task() for managed_node3/fail 13273 1726853323.49045: done queuing things up, now waiting for results queue to drain 13273 1726853323.49046: waiting for pending results... 
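The abort task above was skipped with `"false_condition": "network_state != {}"`: the role's default for `network_state` is an empty mapping, so these guard tasks only fire when a caller actually supplies a desired state. A rough sketch of that skip decision (the function and return shape here are illustrative, not Ansible internals):

```python
def should_run(network_state):
    """Mimic the executor's when-evaluation: return (run?, false_condition)."""
    condition = "network_state != {}"
    if network_state != {}:
        return True, None
    # Matches the log: "when evaluation is False, skipping this task"
    return False, condition

# With the role default ({}), the fail task is skipped:
assert should_run({}) == (False, "network_state != {}")
# With a real desired state, the abort check would actually execute:
assert should_run({"interfaces": []})[0] is True
```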
13273 1726853323.49218: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853323.49315: in run() - task 02083763-bbaf-5fc3-657d-00000000011e 13273 1726853323.49327: variable 'ansible_search_path' from source: unknown 13273 1726853323.49332: variable 'ansible_search_path' from source: unknown 13273 1726853323.49362: calling self._execute() 13273 1726853323.49441: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.49447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.49457: variable 'omit' from source: magic vars 13273 1726853323.49741: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.49753: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.49846: variable 'network_state' from source: role '' defaults 13273 1726853323.49857: Evaluated conditional (network_state != {}): False 13273 1726853323.49860: when evaluation is False, skipping this task 13273 1726853323.49862: _execute() done 13273 1726853323.49865: dumping result to json 13273 1726853323.49867: done dumping result, returning 13273 1726853323.49875: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-5fc3-657d-00000000011e] 13273 1726853323.49879: sending task result for task 02083763-bbaf-5fc3-657d-00000000011e 13273 1726853323.49960: done sending task result for task 02083763-bbaf-5fc3-657d-00000000011e 13273 1726853323.49963: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853323.50008: no more pending results, returning what we have 13273 
1726853323.50011: results queue empty 13273 1726853323.50012: checking for any_errors_fatal 13273 1726853323.50019: done checking for any_errors_fatal 13273 1726853323.50020: checking for max_fail_percentage 13273 1726853323.50021: done checking for max_fail_percentage 13273 1726853323.50022: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.50023: done checking to see if all hosts have failed 13273 1726853323.50023: getting the remaining hosts for this loop 13273 1726853323.50024: done getting the remaining hosts for this loop 13273 1726853323.50027: getting the next task for host managed_node3 13273 1726853323.50034: done getting next task for host managed_node3 13273 1726853323.50043: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853323.50046: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853323.50068: getting variables 13273 1726853323.50069: in VariableManager get_vars() 13273 1726853323.50248: Calling all_inventory to load vars for managed_node3 13273 1726853323.50251: Calling groups_inventory to load vars for managed_node3 13273 1726853323.50254: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.50262: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.50266: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.50268: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.51382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.52232: done with get_vars() 13273 1726853323.52250: done getting variables 13273 1726853323.52291: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:28:43 -0400 (0:00:00.035) 0:00:41.412 ****** 13273 1726853323.52314: entering _queue_task() for managed_node3/fail 13273 1726853323.52527: worker is 1 (out of 1 available) 13273 1726853323.52541: exiting _queue_task() for managed_node3/fail 13273 1726853323.52555: done queuing things up, now waiting for results queue to drain 13273 1726853323.52556: waiting for pending results... 
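The many `Loading FilterModule ... (found_in_cache=True, class_only=...)` lines in this trace come from Ansible's plugin loader, which memoizes loaded plugin files and records whether only the class or a constructed instance was requested. A toy sketch of that caching behavior (names and structure are illustrative; the real loader is `ansible.plugins.loader.PluginLoader` and does far more):

```python
class ToyPluginLoader:
    """Illustrative memoizing loader; not Ansible's real implementation."""
    def __init__(self):
        self._cache = {}

    def load(self, name, path, class_only=False):
        found_in_cache = path in self._cache
        if not found_in_cache:
            # The real loader imports the module here; we fabricate a stand-in.
            self._cache[path] = type(name, (), {"source": path})
        cls = self._cache[path]
        # Mirrors the log line shape: (found_in_cache=..., class_only=...)
        print(f"Loading {name} from {path} "
              f"(found_in_cache={found_in_cache}, class_only={class_only})")
        return cls if class_only else cls()

loader = ToyPluginLoader()
first = loader.load("core", "/tmp/example/filter/core.py", class_only=True)
second = loader.load("core", "/tmp/example/filter/core.py")  # now cached
```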
13273 1726853323.52718: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853323.52805: in run() - task 02083763-bbaf-5fc3-657d-00000000011f 13273 1726853323.52816: variable 'ansible_search_path' from source: unknown 13273 1726853323.52819: variable 'ansible_search_path' from source: unknown 13273 1726853323.52848: calling self._execute() 13273 1726853323.52923: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.52928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.52936: variable 'omit' from source: magic vars 13273 1726853323.53209: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.53219: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.53332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853323.54800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853323.55077: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853323.55104: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853323.55128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853323.55149: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853323.55205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.55225: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.55242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.55269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.55282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.55343: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.55358: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13273 1726853323.55435: variable 'ansible_distribution' from source: facts 13273 1726853323.55439: variable '__network_rh_distros' from source: role '' defaults 13273 1726853323.55447: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13273 1726853323.55606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.55623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.55639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 
1726853323.55666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.55679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.55714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.55729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.55745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.55773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.55784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.55812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.55831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13273 1726853323.55847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.55874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.55884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.56266: variable 'network_connections' from source: task vars 13273 1726853323.56270: variable 'port1_profile' from source: play vars 13273 1726853323.56275: variable 'port1_profile' from source: play vars 13273 1726853323.56277: variable 'port2_profile' from source: play vars 13273 1726853323.56283: variable 'port2_profile' from source: play vars 13273 1726853323.56286: variable 'network_state' from source: role '' defaults 13273 1726853323.56350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853323.56538: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853323.56579: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853323.56618: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853323.56663: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853323.56712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 
1726853323.56753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853323.56785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.56817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853323.56855: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13273 1726853323.56862: when evaluation is False, skipping this task 13273 1726853323.56868: _execute() done 13273 1726853323.56879: dumping result to json 13273 1726853323.56887: done dumping result, returning 13273 1726853323.56899: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-5fc3-657d-00000000011f] 13273 1726853323.56908: sending task result for task 02083763-bbaf-5fc3-657d-00000000011f skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13273 1726853323.57075: no more pending results, returning what we have 13273 1726853323.57078: results queue empty 13273 
1726853323.57079: checking for any_errors_fatal 13273 1726853323.57083: done checking for any_errors_fatal 13273 1726853323.57084: checking for max_fail_percentage 13273 1726853323.57085: done checking for max_fail_percentage 13273 1726853323.57086: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.57087: done checking to see if all hosts have failed 13273 1726853323.57087: getting the remaining hosts for this loop 13273 1726853323.57089: done getting the remaining hosts for this loop 13273 1726853323.57092: getting the next task for host managed_node3 13273 1726853323.57097: done getting next task for host managed_node3 13273 1726853323.57100: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853323.57103: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853323.57121: getting variables 13273 1726853323.57123: in VariableManager get_vars() 13273 1726853323.57176: Calling all_inventory to load vars for managed_node3 13273 1726853323.57180: Calling groups_inventory to load vars for managed_node3 13273 1726853323.57182: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.57191: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.57195: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.57198: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.57785: done sending task result for task 02083763-bbaf-5fc3-657d-00000000011f 13273 1726853323.57788: WORKER PROCESS EXITING 13273 1726853323.58120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.59081: done with get_vars() 13273 1726853323.59102: done getting variables 13273 1726853323.59154: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:28:43 -0400 (0:00:00.068) 0:00:41.480 ****** 13273 1726853323.59186: entering _queue_task() for managed_node3/dnf 13273 1726853323.59483: worker is 1 (out of 1 available) 13273 1726853323.59495: exiting _queue_task() for managed_node3/dnf 13273 1726853323.59506: done queuing things up, now waiting for results queue to drain 13273 1726853323.59507: waiting for pending results... 
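The teaming abort task above was skipped because its conditional found no `type: team` entries in either `network_connections` or `network_state["interfaces"]`. A plain-Python approximation of that Jinja filter chain (`selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`), with sample connection data invented for illustration:

```python
import re

def has_team(network_connections, network_state):
    """Approximate the role's Jinja conditional in plain Python."""
    def team_count(items):
        # selectattr("type", "defined"): keep entries that define "type";
        # selectattr("type", "match", "^team$"): anchored regex, like re.match.
        return len([i for i in items
                    if "type" in i and re.match(r"^team$", i["type"])])
    return (team_count(network_connections) > 0
            or team_count(network_state.get("interfaces", [])) > 0)

# Sample profiles (invented) with no team type, as in the logged skip:
conns = [{"name": "port1", "type": "ethernet"}, {"name": "port2"}]
assert has_team(conns, {}) is False
assert has_team([{"type": "team"}], {}) is True
```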
13273 1726853323.59928: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853323.59934: in run() - task 02083763-bbaf-5fc3-657d-000000000120 13273 1726853323.59938: variable 'ansible_search_path' from source: unknown 13273 1726853323.59941: variable 'ansible_search_path' from source: unknown 13273 1726853323.59944: calling self._execute() 13273 1726853323.60029: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.60041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.60055: variable 'omit' from source: magic vars 13273 1726853323.60434: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.60450: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.60648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853323.62859: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853323.62940: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853323.62985: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853323.63027: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853323.63127: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853323.63130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.63160: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.63188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.63237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.63262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.63391: variable 'ansible_distribution' from source: facts 13273 1726853323.63401: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.63423: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13273 1726853323.63544: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853323.63690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.63719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.63752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.63876: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.63881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.63884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.63893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.63999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.64003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.64005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.64032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.64064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 
1726853323.64096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.64144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.64168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.64341: variable 'network_connections' from source: task vars 13273 1726853323.64363: variable 'port1_profile' from source: play vars 13273 1726853323.64429: variable 'port1_profile' from source: play vars 13273 1726853323.64451: variable 'port2_profile' from source: play vars 13273 1726853323.64513: variable 'port2_profile' from source: play vars 13273 1726853323.64593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853323.64873: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853323.64876: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853323.64878: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853323.64891: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853323.64937: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853323.64967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853323.65010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.65076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853323.65102: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853323.65354: variable 'network_connections' from source: task vars 13273 1726853323.65364: variable 'port1_profile' from source: play vars 13273 1726853323.65459: variable 'port1_profile' from source: play vars 13273 1726853323.65474: variable 'port2_profile' from source: play vars 13273 1726853323.65641: variable 'port2_profile' from source: play vars 13273 1726853323.65644: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853323.65649: when evaluation is False, skipping this task 13273 1726853323.65652: _execute() done 13273 1726853323.65655: dumping result to json 13273 1726853323.65657: done dumping result, returning 13273 1726853323.65659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000120] 13273 1726853323.65661: sending task result for task 02083763-bbaf-5fc3-657d-000000000120 13273 1726853323.65731: done sending task result for task 02083763-bbaf-5fc3-657d-000000000120 13273 1726853323.65734: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or 
__network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853323.65801: no more pending results, returning what we have 13273 1726853323.65804: results queue empty 13273 1726853323.65805: checking for any_errors_fatal 13273 1726853323.65813: done checking for any_errors_fatal 13273 1726853323.65814: checking for max_fail_percentage 13273 1726853323.65815: done checking for max_fail_percentage 13273 1726853323.65817: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.65817: done checking to see if all hosts have failed 13273 1726853323.65818: getting the remaining hosts for this loop 13273 1726853323.65819: done getting the remaining hosts for this loop 13273 1726853323.65824: getting the next task for host managed_node3 13273 1726853323.65831: done getting next task for host managed_node3 13273 1726853323.65836: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13273 1726853323.65839: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853323.65863: getting variables 13273 1726853323.65865: in VariableManager get_vars() 13273 1726853323.65922: Calling all_inventory to load vars for managed_node3 13273 1726853323.65925: Calling groups_inventory to load vars for managed_node3 13273 1726853323.65928: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.65938: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.65942: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.65948: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.67223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.68092: done with get_vars() 13273 1726853323.68107: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853323.68160: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:28:43 -0400 (0:00:00.089) 0:00:41.571 ****** 13273 1726853323.68185: entering _queue_task() for managed_node3/yum 13273 1726853323.68416: worker is 1 (out of 1 available) 13273 1726853323.68430: exiting _queue_task() for managed_node3/yum 13273 1726853323.68452: done queuing things up, now waiting for results queue to drain 13273 1726853323.68454: waiting for pending results... 
13273 1726853323.68792: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13273 1726853323.68806: in run() - task 02083763-bbaf-5fc3-657d-000000000121 13273 1726853323.68825: variable 'ansible_search_path' from source: unknown 13273 1726853323.68834: variable 'ansible_search_path' from source: unknown 13273 1726853323.68881: calling self._execute() 13273 1726853323.68994: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.69010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.69025: variable 'omit' from source: magic vars 13273 1726853323.69436: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.69527: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.69622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853323.71351: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853323.71396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853323.71457: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853323.71475: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853323.71498: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853323.71776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.71779: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.71782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.71784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.71787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.71789: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.71808: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13273 1726853323.71814: when evaluation is False, skipping this task 13273 1726853323.71820: _execute() done 13273 1726853323.71827: dumping result to json 13273 1726853323.71833: done dumping result, returning 13273 1726853323.71844: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000121] 13273 1726853323.71859: sending task result for task 02083763-bbaf-5fc3-657d-000000000121 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13273 1726853323.72017: no more pending results, returning what we have 13273 1726853323.72020: results queue empty 13273 1726853323.72021: checking for any_errors_fatal 13273 1726853323.72027: done 
checking for any_errors_fatal 13273 1726853323.72028: checking for max_fail_percentage 13273 1726853323.72030: done checking for max_fail_percentage 13273 1726853323.72031: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.72032: done checking to see if all hosts have failed 13273 1726853323.72032: getting the remaining hosts for this loop 13273 1726853323.72033: done getting the remaining hosts for this loop 13273 1726853323.72036: getting the next task for host managed_node3 13273 1726853323.72042: done getting next task for host managed_node3 13273 1726853323.72047: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13273 1726853323.72050: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853323.72072: getting variables 13273 1726853323.72073: in VariableManager get_vars() 13273 1726853323.72122: Calling all_inventory to load vars for managed_node3 13273 1726853323.72124: Calling groups_inventory to load vars for managed_node3 13273 1726853323.72126: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.72135: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.72138: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.72141: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.72684: done sending task result for task 02083763-bbaf-5fc3-657d-000000000121 13273 1726853323.72688: WORKER PROCESS EXITING 13273 1726853323.73484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.74339: done with get_vars() 13273 1726853323.74356: done getting variables 13273 1726853323.74401: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:28:43 -0400 (0:00:00.062) 0:00:41.633 ****** 13273 1726853323.74424: entering _queue_task() for managed_node3/fail 13273 1726853323.74738: worker is 1 (out of 1 available) 13273 1726853323.74752: exiting _queue_task() for managed_node3/fail 13273 1726853323.74764: done queuing things up, now waiting for results queue to drain 13273 1726853323.74765: waiting for pending results... 
13273 1726853323.75056: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13273 1726853323.75215: in run() - task 02083763-bbaf-5fc3-657d-000000000122 13273 1726853323.75225: variable 'ansible_search_path' from source: unknown 13273 1726853323.75251: variable 'ansible_search_path' from source: unknown 13273 1726853323.75292: calling self._execute() 13273 1726853323.75373: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.75563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.75567: variable 'omit' from source: magic vars 13273 1726853323.75812: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.75831: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.75955: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853323.76158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853323.77642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853323.77694: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853323.77720: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853323.77747: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853323.77765: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853323.77821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853323.77841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.77859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.77890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.77900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.77931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.77949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.77989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.78059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.78062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.78064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.78089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.78124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.78205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.78208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.78327: variable 'network_connections' from source: task vars 13273 1726853323.78344: variable 'port1_profile' from source: play vars 13273 1726853323.78401: variable 'port1_profile' from source: play vars 13273 1726853323.78476: variable 'port2_profile' from source: play vars 13273 1726853323.78480: variable 'port2_profile' from source: play vars 13273 1726853323.78531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853323.78696: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853323.78730: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 
1726853323.78767: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853323.78804: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853323.78876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853323.78880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853323.78906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.78938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853323.78996: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853323.79177: variable 'network_connections' from source: task vars 13273 1726853323.79181: variable 'port1_profile' from source: play vars 13273 1726853323.79231: variable 'port1_profile' from source: play vars 13273 1726853323.79237: variable 'port2_profile' from source: play vars 13273 1726853323.79284: variable 'port2_profile' from source: play vars 13273 1726853323.79302: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853323.79305: when evaluation is False, skipping this task 13273 1726853323.79307: _execute() done 13273 1726853323.79310: dumping result to json 13273 1726853323.79312: done dumping result, returning 13273 1726853323.79319: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000122] 13273 1726853323.79330: sending task result for task 02083763-bbaf-5fc3-657d-000000000122 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853323.79457: no more pending results, returning what we have 13273 1726853323.79461: results queue empty 13273 1726853323.79462: checking for any_errors_fatal 13273 1726853323.79468: done checking for any_errors_fatal 13273 1726853323.79468: checking for max_fail_percentage 13273 1726853323.79470: done checking for max_fail_percentage 13273 1726853323.79472: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.79473: done checking to see if all hosts have failed 13273 1726853323.79473: getting the remaining hosts for this loop 13273 1726853323.79474: done getting the remaining hosts for this loop 13273 1726853323.79477: getting the next task for host managed_node3 13273 1726853323.79483: done getting next task for host managed_node3 13273 1726853323.79486: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13273 1726853323.79489: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853323.79509: getting variables 13273 1726853323.79510: in VariableManager get_vars() 13273 1726853323.79559: Calling all_inventory to load vars for managed_node3 13273 1726853323.79562: Calling groups_inventory to load vars for managed_node3 13273 1726853323.79564: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.79578: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.79581: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.79585: done sending task result for task 02083763-bbaf-5fc3-657d-000000000122 13273 1726853323.79588: WORKER PROCESS EXITING 13273 1726853323.79591: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.80365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.81324: done with get_vars() 13273 1726853323.81338: done getting variables 13273 1726853323.81382: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:28:43 -0400 (0:00:00.069) 0:00:41.703 ****** 13273 1726853323.81406: entering _queue_task() for managed_node3/package 13273 1726853323.81619: worker is 1 (out of 1 available) 13273 1726853323.81632: exiting _queue_task() for managed_node3/package 13273 1726853323.81644: done queuing things up, now waiting for results queue to drain 13273 1726853323.81645: waiting for pending results... 
13273 1726853323.81819: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 13273 1726853323.81902: in run() - task 02083763-bbaf-5fc3-657d-000000000123 13273 1726853323.81913: variable 'ansible_search_path' from source: unknown 13273 1726853323.81917: variable 'ansible_search_path' from source: unknown 13273 1726853323.81944: calling self._execute() 13273 1726853323.82023: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.82026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.82036: variable 'omit' from source: magic vars 13273 1726853323.82309: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.82319: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.82450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853323.82637: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853323.82673: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853323.82699: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853323.82750: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853323.82829: variable 'network_packages' from source: role '' defaults 13273 1726853323.82903: variable '__network_provider_setup' from source: role '' defaults 13273 1726853323.82912: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853323.82960: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853323.82966: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853323.83011: variable 
'__network_packages_default_nm' from source: role '' defaults 13273 1726853323.83127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853323.84441: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853323.84488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853323.84515: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853323.84539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853323.84560: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853323.84627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.84647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.84666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.84697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.84705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 
1726853323.84736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.84754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.84772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.84795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.84807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.84947: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853323.85018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.85036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.85055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.85081: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.85091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.85152: variable 'ansible_python' from source: facts 13273 1726853323.85173: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853323.85225: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853323.85283: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853323.85364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.85382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.85399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.85423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.85433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.85469: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853323.85490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853323.85506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.85529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853323.85539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853323.85635: variable 'network_connections' from source: task vars 13273 1726853323.85641: variable 'port1_profile' from source: play vars 13273 1726853323.85713: variable 'port1_profile' from source: play vars 13273 1726853323.85722: variable 'port2_profile' from source: play vars 13273 1726853323.85792: variable 'port2_profile' from source: play vars 13273 1726853323.85841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853323.85862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853323.85884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853323.85908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853323.85944: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853323.86123: variable 'network_connections' from source: task vars 13273 1726853323.86126: variable 'port1_profile' from source: play vars 13273 1726853323.86197: variable 'port1_profile' from source: play vars 13273 1726853323.86205: variable 'port2_profile' from source: play vars 13273 1726853323.86276: variable 'port2_profile' from source: play vars 13273 1726853323.86298: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853323.86354: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853323.86544: variable 'network_connections' from source: task vars 13273 1726853323.86547: variable 'port1_profile' from source: play vars 13273 1726853323.86595: variable 'port1_profile' from source: play vars 13273 1726853323.86601: variable 'port2_profile' from source: play vars 13273 1726853323.86645: variable 'port2_profile' from source: play vars 13273 1726853323.86775: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853323.86778: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853323.86911: variable 'network_connections' from source: task vars 13273 1726853323.86914: variable 'port1_profile' from source: play vars 13273 1726853323.86958: variable 'port1_profile' from source: play vars 13273 1726853323.86964: variable 'port2_profile' from source: play vars 13273 1726853323.87011: variable 'port2_profile' from source: play vars 13273 1726853323.87050: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853323.87093: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853323.87097: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853323.87138: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853323.87269: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853323.87568: variable 'network_connections' from source: task vars 13273 1726853323.87573: variable 'port1_profile' from source: play vars 13273 1726853323.87614: variable 'port1_profile' from source: play vars 13273 1726853323.87620: variable 'port2_profile' from source: play vars 13273 1726853323.87663: variable 'port2_profile' from source: play vars 13273 1726853323.87670: variable 'ansible_distribution' from source: facts 13273 1726853323.87674: variable '__network_rh_distros' from source: role '' defaults 13273 1726853323.87680: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.87691: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853323.87795: variable 'ansible_distribution' from source: facts 13273 1726853323.87799: variable '__network_rh_distros' from source: role '' defaults 13273 1726853323.87802: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.87813: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853323.87919: variable 'ansible_distribution' from source: facts 13273 1726853323.87922: variable '__network_rh_distros' from source: role '' defaults 13273 1726853323.87926: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.87952: variable 'network_provider' from source: set_fact 13273 1726853323.87964: variable 'ansible_facts' from source: unknown 
13273 1726853323.88381: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13273 1726853323.88385: when evaluation is False, skipping this task 13273 1726853323.88387: _execute() done 13273 1726853323.88390: dumping result to json 13273 1726853323.88392: done dumping result, returning 13273 1726853323.88402: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-5fc3-657d-000000000123] 13273 1726853323.88405: sending task result for task 02083763-bbaf-5fc3-657d-000000000123 13273 1726853323.88491: done sending task result for task 02083763-bbaf-5fc3-657d-000000000123 13273 1726853323.88494: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13273 1726853323.88554: no more pending results, returning what we have 13273 1726853323.88557: results queue empty 13273 1726853323.88558: checking for any_errors_fatal 13273 1726853323.88564: done checking for any_errors_fatal 13273 1726853323.88565: checking for max_fail_percentage 13273 1726853323.88567: done checking for max_fail_percentage 13273 1726853323.88568: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.88569: done checking to see if all hosts have failed 13273 1726853323.88569: getting the remaining hosts for this loop 13273 1726853323.88572: done getting the remaining hosts for this loop 13273 1726853323.88575: getting the next task for host managed_node3 13273 1726853323.88582: done getting next task for host managed_node3 13273 1726853323.88585: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853323.88588: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853323.88614: getting variables 13273 1726853323.88615: in VariableManager get_vars() 13273 1726853323.88666: Calling all_inventory to load vars for managed_node3 13273 1726853323.88668: Calling groups_inventory to load vars for managed_node3 13273 1726853323.88675: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.88684: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.88686: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.88689: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.90136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.91779: done with get_vars() 13273 1726853323.91805: done getting variables 13273 1726853323.91867: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:28:43 -0400 (0:00:00.105) 0:00:41.808 ****** 13273 1726853323.91914: entering _queue_task() for managed_node3/package 13273 1726853323.92243: worker is 
1 (out of 1 available) 13273 1726853323.92258: exiting _queue_task() for managed_node3/package 13273 1726853323.92270: done queuing things up, now waiting for results queue to drain 13273 1726853323.92476: waiting for pending results... 13273 1726853323.92605: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853323.92714: in run() - task 02083763-bbaf-5fc3-657d-000000000124 13273 1726853323.92742: variable 'ansible_search_path' from source: unknown 13273 1726853323.92755: variable 'ansible_search_path' from source: unknown 13273 1726853323.92810: calling self._execute() 13273 1726853323.92920: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.93029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.93033: variable 'omit' from source: magic vars 13273 1726853323.93351: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.93374: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.93504: variable 'network_state' from source: role '' defaults 13273 1726853323.93519: Evaluated conditional (network_state != {}): False 13273 1726853323.93527: when evaluation is False, skipping this task 13273 1726853323.93577: _execute() done 13273 1726853323.93581: dumping result to json 13273 1726853323.93583: done dumping result, returning 13273 1726853323.93586: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-5fc3-657d-000000000124] 13273 1726853323.93588: sending task result for task 02083763-bbaf-5fc3-657d-000000000124 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853323.93730: no more pending results, 
returning what we have 13273 1726853323.93734: results queue empty 13273 1726853323.93735: checking for any_errors_fatal 13273 1726853323.93748: done checking for any_errors_fatal 13273 1726853323.93749: checking for max_fail_percentage 13273 1726853323.93751: done checking for max_fail_percentage 13273 1726853323.93752: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.93753: done checking to see if all hosts have failed 13273 1726853323.93754: getting the remaining hosts for this loop 13273 1726853323.93755: done getting the remaining hosts for this loop 13273 1726853323.93758: getting the next task for host managed_node3 13273 1726853323.93766: done getting next task for host managed_node3 13273 1726853323.93773: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853323.93777: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853323.93801: getting variables 13273 1726853323.93803: in VariableManager get_vars() 13273 1726853323.93861: Calling all_inventory to load vars for managed_node3 13273 1726853323.93864: Calling groups_inventory to load vars for managed_node3 13273 1726853323.93867: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.93982: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.93986: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.94177: Calling groups_plugins_play to load vars for managed_node3 13273 1726853323.94884: done sending task result for task 02083763-bbaf-5fc3-657d-000000000124 13273 1726853323.94887: WORKER PROCESS EXITING 13273 1726853323.95668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853323.97228: done with get_vars() 13273 1726853323.97252: done getting variables 13273 1726853323.97311: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:28:43 -0400 (0:00:00.054) 0:00:41.862 ****** 13273 1726853323.97343: entering _queue_task() for managed_node3/package 13273 1726853323.97663: worker is 1 (out of 1 available) 13273 1726853323.97676: exiting _queue_task() for managed_node3/package 13273 1726853323.97688: done queuing things up, now waiting for results queue to drain 13273 1726853323.97689: waiting for pending results... 
13273 1726853323.97977: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853323.98122: in run() - task 02083763-bbaf-5fc3-657d-000000000125 13273 1726853323.98141: variable 'ansible_search_path' from source: unknown 13273 1726853323.98153: variable 'ansible_search_path' from source: unknown 13273 1726853323.98195: calling self._execute() 13273 1726853323.98304: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853323.98325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853323.98341: variable 'omit' from source: magic vars 13273 1726853323.98737: variable 'ansible_distribution_major_version' from source: facts 13273 1726853323.98762: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853323.98897: variable 'network_state' from source: role '' defaults 13273 1726853323.98911: Evaluated conditional (network_state != {}): False 13273 1726853323.98919: when evaluation is False, skipping this task 13273 1726853323.98926: _execute() done 13273 1726853323.98970: dumping result to json 13273 1726853323.98974: done dumping result, returning 13273 1726853323.98978: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-5fc3-657d-000000000125] 13273 1726853323.98980: sending task result for task 02083763-bbaf-5fc3-657d-000000000125 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853323.99118: no more pending results, returning what we have 13273 1726853323.99122: results queue empty 13273 1726853323.99123: checking for any_errors_fatal 13273 1726853323.99128: done checking for any_errors_fatal 13273 1726853323.99129: checking for max_fail_percentage 13273 
1726853323.99131: done checking for max_fail_percentage 13273 1726853323.99132: checking to see if all hosts have failed and the running result is not ok 13273 1726853323.99133: done checking to see if all hosts have failed 13273 1726853323.99133: getting the remaining hosts for this loop 13273 1726853323.99135: done getting the remaining hosts for this loop 13273 1726853323.99138: getting the next task for host managed_node3 13273 1726853323.99148: done getting next task for host managed_node3 13273 1726853323.99152: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853323.99156: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853323.99181: getting variables 13273 1726853323.99183: in VariableManager get_vars() 13273 1726853323.99236: Calling all_inventory to load vars for managed_node3 13273 1726853323.99240: Calling groups_inventory to load vars for managed_node3 13273 1726853323.99243: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853323.99256: Calling all_plugins_play to load vars for managed_node3 13273 1726853323.99260: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853323.99263: Calling groups_plugins_play to load vars for managed_node3 13273 1726853324.00075: done sending task result for task 02083763-bbaf-5fc3-657d-000000000125 13273 1726853324.00078: WORKER PROCESS EXITING 13273 1726853324.00847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853324.02403: done with get_vars() 13273 1726853324.02423: done getting variables 13273 1726853324.02485: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:28:44 -0400 (0:00:00.051) 0:00:41.914 ****** 13273 1726853324.02520: entering _queue_task() for managed_node3/service 13273 1726853324.02817: worker is 1 (out of 1 available) 13273 1726853324.02829: exiting _queue_task() for managed_node3/service 13273 1726853324.02840: done queuing things up, now waiting for results queue to drain 13273 1726853324.02841: waiting for pending results... 
13273 1726853324.03127: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853324.03296: in run() - task 02083763-bbaf-5fc3-657d-000000000126 13273 1726853324.03300: variable 'ansible_search_path' from source: unknown 13273 1726853324.03303: variable 'ansible_search_path' from source: unknown 13273 1726853324.03335: calling self._execute() 13273 1726853324.03676: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853324.03680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853324.03683: variable 'omit' from source: magic vars 13273 1726853324.03865: variable 'ansible_distribution_major_version' from source: facts 13273 1726853324.03885: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853324.04016: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853324.04223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853324.06776: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853324.06844: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853324.06900: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853324.06936: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853324.06970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853324.07100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13273 1726853324.07134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853324.07168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.07211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853324.07227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853324.07279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853324.07304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853324.07328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.07370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853324.07392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853324.07434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853324.07464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853324.07495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.07533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853324.07551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853324.07718: variable 'network_connections' from source: task vars 13273 1726853324.07734: variable 'port1_profile' from source: play vars 13273 1726853324.07803: variable 'port1_profile' from source: play vars 13273 1726853324.07876: variable 'port2_profile' from source: play vars 13273 1726853324.07885: variable 'port2_profile' from source: play vars 13273 1726853324.07959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853324.08125: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853324.08173: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 
1726853324.08207: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853324.08256: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853324.08302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853324.08329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853324.08365: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.08397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853324.08464: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853324.09004: variable 'network_connections' from source: task vars 13273 1726853324.09007: variable 'port1_profile' from source: play vars 13273 1726853324.09053: variable 'port1_profile' from source: play vars 13273 1726853324.09088: variable 'port2_profile' from source: play vars 13273 1726853324.09175: variable 'port2_profile' from source: play vars 13273 1726853324.09223: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853324.09233: when evaluation is False, skipping this task 13273 1726853324.09257: _execute() done 13273 1726853324.09266: dumping result to json 13273 1726853324.09276: done dumping result, returning 13273 1726853324.09289: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000126] 13273 1726853324.09308: sending task result for task 02083763-bbaf-5fc3-657d-000000000126
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
13273 1726853324.09486: no more pending results, returning what we have 13273 1726853324.09489: results queue empty 13273 1726853324.09490: checking for any_errors_fatal 13273 1726853324.09496: done checking for any_errors_fatal 13273 1726853324.09496: checking for max_fail_percentage 13273 1726853324.09498: done checking for max_fail_percentage 13273 1726853324.09499: checking to see if all hosts have failed and the running result is not ok 13273 1726853324.09500: done checking to see if all hosts have failed 13273 1726853324.09501: getting the remaining hosts for this loop 13273 1726853324.09502: done getting the remaining hosts for this loop 13273 1726853324.09506: getting the next task for host managed_node3 13273 1726853324.09513: done getting next task for host managed_node3 13273 1726853324.09517: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853324.09520: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 13273 1726853324.09541: getting variables 13273 1726853324.09543: in VariableManager get_vars() 13273 1726853324.09603: Calling all_inventory to load vars for managed_node3 13273 1726853324.09606: Calling groups_inventory to load vars for managed_node3 13273 1726853324.09608: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853324.09618: Calling all_plugins_play to load vars for managed_node3 13273 1726853324.09622: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853324.09624: Calling groups_plugins_play to load vars for managed_node3 13273 1726853324.10323: done sending task result for task 02083763-bbaf-5fc3-657d-000000000126 13273 1726853324.10326: WORKER PROCESS EXITING 13273 1726853324.11596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853324.14620: done with get_vars() 13273 1726853324.14649: done getting variables 13273 1726853324.14710: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 13:28:44 -0400 (0:00:00.122) 0:00:42.036 ******
13273 1726853324.14743: entering _queue_task() for managed_node3/service 13273 1726853324.15481: worker is 1 (out of 1 available) 13273 1726853324.15494: exiting _queue_task() for managed_node3/service 13273 1726853324.15506: done queuing things up, now waiting for results queue to drain 13273 1726853324.15507: waiting for pending results... 
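The skip recorded above ("Evaluated conditional ... : False", then the `skipping:` result with a `false_condition` field) can be approximated with a short sketch. This is not Ansible source code: Ansible templates each `when` clause through Jinja2 against the full variable scope, while the sketch below uses plain `eval` over a dict purely for illustration; the helper name `evaluate_when` is made up.

```python
# Illustrative sketch (not Ansible internals) of how a `when` conditional
# that evaluates to False produces a skipped-task result like the one in
# the log above.

def evaluate_when(conditions, variables):
    """Return (ok, failing_condition); ok is True only if all conditions hold."""
    for cond in conditions:
        # Ansible would render this through Jinja2; plain eval() is a
        # stand-in for illustration only.
        if not eval(cond, {}, variables):
            return False, cond
    return True, None

variables = {
    "__network_wireless_connections_defined": False,
    "__network_team_connections_defined": False,
}
ok, failed = evaluate_when(
    ["__network_wireless_connections_defined or __network_team_connections_defined"],
    variables,
)
if not ok:
    # Shape mirrors the skipped-task result emitted in the log.
    result = {
        "changed": False,
        "false_condition": failed,
        "skip_reason": "Conditional result was False",
    }
```

With both role defaults false, the task is skipped rather than restarting NetworkManager, which is exactly what the log records for task 000000000126.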
13273 1726853324.15994: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853324.16305: in run() - task 02083763-bbaf-5fc3-657d-000000000127 13273 1726853324.16495: variable 'ansible_search_path' from source: unknown 13273 1726853324.16499: variable 'ansible_search_path' from source: unknown 13273 1726853324.16502: calling self._execute() 13273 1726853324.16664: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853324.16679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853324.16724: variable 'omit' from source: magic vars 13273 1726853324.17500: variable 'ansible_distribution_major_version' from source: facts 13273 1726853324.17687: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853324.17868: variable 'network_provider' from source: set_fact 13273 1726853324.17911: variable 'network_state' from source: role '' defaults 13273 1726853324.17925: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13273 1726853324.17984: variable 'omit' from source: magic vars 13273 1726853324.18044: variable 'omit' from source: magic vars 13273 1726853324.18153: variable 'network_service_name' from source: role '' defaults 13273 1726853324.18338: variable 'network_service_name' from source: role '' defaults 13273 1726853324.18558: variable '__network_provider_setup' from source: role '' defaults 13273 1726853324.18570: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853324.18634: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853324.18676: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853324.18878: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853324.19203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13273 1726853324.23819: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853324.23928: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853324.24050: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853324.24092: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853324.24127: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853324.24209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853324.24252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853324.24286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.24330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853324.24358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853324.24409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853324.24437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853324.24475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.24517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853324.24556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853324.24794: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853324.24917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853324.24948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853324.24991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.25100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853324.25103: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853324.25141: variable 'ansible_python' from source: facts 13273 1726853324.25173: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853324.25262: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853324.25349: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853324.25484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853324.25513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853324.25548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.25593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853324.25610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853324.25665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853324.25754: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853324.25757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.25776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853324.25795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853324.25944: variable 'network_connections' from source: task vars 13273 1726853324.25960: variable 'port1_profile' from source: play vars 13273 1726853324.26039: variable 'port1_profile' from source: play vars 13273 1726853324.26059: variable 'port2_profile' from source: play vars 13273 1726853324.26135: variable 'port2_profile' from source: play vars 13273 1726853324.26295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853324.26461: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853324.26518: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853324.26565: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853324.26616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853324.26685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853324.26717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853324.26759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853324.26840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853324.26855: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853324.27153: variable 'network_connections' from source: task vars 13273 1726853324.27168: variable 'port1_profile' from source: play vars 13273 1726853324.27242: variable 'port1_profile' from source: play vars 13273 1726853324.27261: variable 'port2_profile' from source: play vars 13273 1726853324.27336: variable 'port2_profile' from source: play vars 13273 1726853324.27385: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853324.27464: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853324.27819: variable 'network_connections' from source: task vars 13273 1726853324.27823: variable 'port1_profile' from source: play vars 13273 1726853324.27859: variable 'port1_profile' from source: play vars 13273 1726853324.27874: variable 'port2_profile' from source: play vars 13273 1726853324.27950: variable 'port2_profile' from source: play vars 13273 1726853324.27979: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853324.28063: variable '__network_team_connections_defined' from source: role '' defaults 
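The repeated `variable '...' from source: ...` records above show the variable manager resolving each name against layered sources (role defaults, facts, play vars, task vars, set_fact). A minimal sketch of that lookup order, using `collections.ChainMap`: note this collapses Ansible's full precedence ladder to four levels for illustration, and the sample values (`bond0.0`, `bond0.1`, `teamd`) are hypothetical, since the log does not print variable contents.

```python
# Sketch of layered variable resolution: the highest-precedence source
# that defines a name wins. Values here are invented for illustration.
from collections import ChainMap

role_defaults = {
    "network_service_name": "NetworkManager",  # as in the role's defaults
    "__network_packages_default_team": ["teamd"],  # hypothetical value
}
facts = {"ansible_distribution_major_version": "10"}  # hypothetical value
play_vars = {"port1_profile": "bond0.0", "port2_profile": "bond0.1"}  # hypothetical
task_vars = {"network_connections": [{"name": "bond0.0"}, {"name": "bond0.1"}]}

# ChainMap searches maps left to right, so higher-precedence sources go first.
variables = ChainMap(task_vars, play_vars, facts, role_defaults)

variables["port1_profile"]        # found in play vars
variables["network_service_name"]  # falls through to role defaults
```

This matches the pattern in the log: `port1_profile`/`port2_profile` resolve from play vars, while names like `network_service_name` fall through to the role's defaults.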
13273 1726853324.28375: variable 'network_connections' from source: task vars 13273 1726853324.28475: variable 'port1_profile' from source: play vars 13273 1726853324.28478: variable 'port1_profile' from source: play vars 13273 1726853324.28481: variable 'port2_profile' from source: play vars 13273 1726853324.28539: variable 'port2_profile' from source: play vars 13273 1726853324.28618: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853324.28685: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853324.28700: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853324.28762: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853324.28976: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853324.30023: variable 'network_connections' from source: task vars 13273 1726853324.30179: variable 'port1_profile' from source: play vars 13273 1726853324.30183: variable 'port1_profile' from source: play vars 13273 1726853324.30213: variable 'port2_profile' from source: play vars 13273 1726853324.30279: variable 'port2_profile' from source: play vars 13273 1726853324.30539: variable 'ansible_distribution' from source: facts 13273 1726853324.30542: variable '__network_rh_distros' from source: role '' defaults 13273 1726853324.30544: variable 'ansible_distribution_major_version' from source: facts 13273 1726853324.30549: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853324.30743: variable 'ansible_distribution' from source: facts 13273 1726853324.30870: variable '__network_rh_distros' from source: role '' defaults 13273 1726853324.30978: variable 'ansible_distribution_major_version' from source: facts 13273 1726853324.30981: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' 
defaults 13273 1726853324.31178: variable 'ansible_distribution' from source: facts 13273 1726853324.31201: variable '__network_rh_distros' from source: role '' defaults 13273 1726853324.31212: variable 'ansible_distribution_major_version' from source: facts 13273 1726853324.31318: variable 'network_provider' from source: set_fact 13273 1726853324.31350: variable 'omit' from source: magic vars 13273 1726853324.31444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853324.31482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853324.31541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853324.31654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853324.31672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853324.31707: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853324.31715: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853324.31724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853324.31939: Set connection var ansible_connection to ssh 13273 1726853324.31963: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853324.32178: Set connection var ansible_shell_executable to /bin/sh 13273 1726853324.32181: Set connection var ansible_shell_type to sh 13273 1726853324.32183: Set connection var ansible_pipelining to False 13273 1726853324.32186: Set connection var ansible_timeout to 10 13273 1726853324.32188: variable 'ansible_shell_executable' from source: unknown 13273 1726853324.32190: variable 'ansible_connection' from source: unknown 13273 1726853324.32192: variable 
'ansible_module_compression' from source: unknown 13273 1726853324.32195: variable 'ansible_shell_type' from source: unknown 13273 1726853324.32201: variable 'ansible_shell_executable' from source: unknown 13273 1726853324.32203: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853324.32205: variable 'ansible_pipelining' from source: unknown 13273 1726853324.32207: variable 'ansible_timeout' from source: unknown 13273 1726853324.32209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853324.32362: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853324.32408: variable 'omit' from source: magic vars 13273 1726853324.32485: starting attempt loop 13273 1726853324.32493: running the handler 13273 1726853324.32621: variable 'ansible_facts' from source: unknown 13273 1726853324.34289: _low_level_execute_command(): starting 13273 1726853324.34301: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853324.35783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853324.35801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853324.35896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853324.37619: stdout chunk (state=3): >>>/root <<< 13273 1726853324.37709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853324.37744: stderr chunk (state=3): >>><<< 13273 1726853324.37756: stdout chunk (state=3): >>><<< 13273 1726853324.38012: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853324.38015: _low_level_execute_command(): starting 13273 1726853324.38018: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124 `" && echo ansible-tmp-1726853324.3780413-15203-93286077269124="` echo /root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124 `" ) && sleep 0' 13273 1726853324.39005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853324.39009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853324.39012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853324.39016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853324.39018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853324.39154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853324.39305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 
1726853324.39369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853324.41340: stdout chunk (state=3): >>>ansible-tmp-1726853324.3780413-15203-93286077269124=/root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124 <<< 13273 1726853324.41517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853324.41566: stderr chunk (state=3): >>><<< 13273 1726853324.41569: stdout chunk (state=3): >>><<< 13273 1726853324.41591: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853324.3780413-15203-93286077269124=/root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853324.41876: variable 'ansible_module_compression' from source: unknown 13273 1726853324.41880: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13273 1726853324.41940: variable 'ansible_facts' from source: unknown 13273 1726853324.42357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/AnsiballZ_systemd.py 13273 1726853324.42778: Sending initial data 13273 1726853324.42782: Sent initial data (155 bytes) 13273 1726853324.43981: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853324.43996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853324.44221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853324.44241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853324.44282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853324.44343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853324.45989: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853324.46050: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853324.46110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpw88n9c9l /root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/AnsiballZ_systemd.py <<< 13273 1726853324.46114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/AnsiballZ_systemd.py" <<< 13273 1726853324.46186: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpw88n9c9l" to remote "/root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/AnsiballZ_systemd.py" <<< 13273 1726853324.48991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853324.49112: stderr chunk (state=3): >>><<< 13273 1726853324.49122: stdout chunk (state=3): >>><<< 13273 1726853324.49279: done transferring module to remote 13273 1726853324.49283: _low_level_execute_command(): starting 13273 1726853324.49285: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/ /root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/AnsiballZ_systemd.py && sleep 0' 13273 1726853324.50434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853324.50449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853324.50462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853324.50514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853324.50526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853324.50611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853324.50697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853324.52795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853324.52891: stderr chunk (state=3): >>><<< 13273 1726853324.52894: stdout chunk (state=3): >>><<< 13273 1726853324.52897: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853324.52899: _low_level_execute_command(): starting 13273 1726853324.52901: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/AnsiballZ_systemd.py && sleep 0' 13273 1726853324.53969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853324.53987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853324.54067: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853324.54307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853324.54374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853324.84161: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10567680", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310772224", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1205207000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": 
"[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 13273 1726853324.84194: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", 
"InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13273 1726853324.86327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853324.86331: stdout chunk (state=3): >>><<< 13273 1726853324.86334: stderr chunk (state=3): >>><<< 13273 1726853324.86578: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10567680", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310772224", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1205207000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853324.86760: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853324.86833: _low_level_execute_command(): starting 13273 1726853324.86978: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853324.3780413-15203-93286077269124/ > /dev/null 2>&1 && sleep 0' 13273 1726853324.87728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853324.87735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853324.87753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853324.87759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853324.87787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853324.87790: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853324.87863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853324.87869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853324.87922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853324.88004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853324.90183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853324.90187: stdout chunk (state=3): >>><<< 13273 1726853324.90190: stderr chunk (state=3): >>><<< 13273 1726853324.90192: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853324.90194: handler run complete 13273 1726853324.90196: attempt loop complete, returning result 13273 1726853324.90198: _execute() done 13273 1726853324.90200: dumping result to json 13273 1726853324.90201: done dumping result, returning 13273 1726853324.90203: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-5fc3-657d-000000000127] 13273 1726853324.90205: sending task result for task 02083763-bbaf-5fc3-657d-000000000127 13273 1726853324.91007: done sending task result for task 02083763-bbaf-5fc3-657d-000000000127 13273 1726853324.91010: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853324.91076: no more pending results, returning what we have 13273 1726853324.91080: results queue empty 13273 1726853324.91081: checking for any_errors_fatal 13273 1726853324.91088: done checking for any_errors_fatal 13273 1726853324.91089: checking for max_fail_percentage 13273 1726853324.91091: done checking for max_fail_percentage 13273 1726853324.91092: checking to see if all hosts have failed and the running result is not ok 13273 1726853324.91093: done checking to see if all hosts have failed 13273 1726853324.91093: getting the remaining hosts for this loop 13273 1726853324.91095: done getting the remaining hosts for this loop 13273 1726853324.91098: getting the next task for host managed_node3 13273 1726853324.91104: done getting next task for host managed_node3 13273 1726853324.91107: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853324.91110: ^ state is: HOST 
STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853324.91121: getting variables 13273 1726853324.91122: in VariableManager get_vars() 13273 1726853324.91453: Calling all_inventory to load vars for managed_node3 13273 1726853324.91457: Calling groups_inventory to load vars for managed_node3 13273 1726853324.91460: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853324.91468: Calling all_plugins_play to load vars for managed_node3 13273 1726853324.91474: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853324.91477: Calling groups_plugins_play to load vars for managed_node3 13273 1726853324.93035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853324.94822: done with get_vars() 13273 1726853324.94847: done getting variables 13273 1726853324.94915: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:28:44 -0400 (0:00:00.802) 0:00:42.838 ****** 13273 1726853324.94955: 
entering _queue_task() for managed_node3/service 13273 1726853324.95334: worker is 1 (out of 1 available) 13273 1726853324.95350: exiting _queue_task() for managed_node3/service 13273 1726853324.95363: done queuing things up, now waiting for results queue to drain 13273 1726853324.95365: waiting for pending results... 13273 1726853324.95890: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853324.95896: in run() - task 02083763-bbaf-5fc3-657d-000000000128 13273 1726853324.95900: variable 'ansible_search_path' from source: unknown 13273 1726853324.95902: variable 'ansible_search_path' from source: unknown 13273 1726853324.95905: calling self._execute() 13273 1726853324.96005: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853324.96011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853324.96044: variable 'omit' from source: magic vars 13273 1726853324.96443: variable 'ansible_distribution_major_version' from source: facts 13273 1726853324.96458: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853324.96578: variable 'network_provider' from source: set_fact 13273 1726853324.96584: Evaluated conditional (network_provider == "nm"): True 13273 1726853324.96684: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853324.96976: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853324.96980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853325.01318: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853325.01390: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853325.01419: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853325.01444: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853325.01469: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853325.01544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853325.01567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853325.01587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853325.01611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853325.01622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853325.01658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853325.01677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853325.01694: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853325.01717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853325.01728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853325.01759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853325.01777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853325.01793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853325.01816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853325.01827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853325.01925: variable 'network_connections' from source: task vars 13273 1726853325.01936: variable 'port1_profile' from source: play vars 13273 1726853325.01987: variable 'port1_profile' from source: play vars 13273 
1726853325.01997: variable 'port2_profile' from source: play vars 13273 1726853325.02038: variable 'port2_profile' from source: play vars 13273 1726853325.02090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853325.02201: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853325.02230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853325.02254: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853325.02277: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853325.02307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853325.02323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853325.02339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853325.02359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853325.02399: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853325.02562: variable 'network_connections' from source: task vars 13273 1726853325.02565: variable 'port1_profile' from source: play vars 13273 1726853325.02611: variable 'port1_profile' from source: play vars 13273 1726853325.02614: variable 
'port2_profile' from source: play vars 13273 1726853325.02659: variable 'port2_profile' from source: play vars 13273 1726853325.02682: Evaluated conditional (__network_wpa_supplicant_required): False 13273 1726853325.02685: when evaluation is False, skipping this task 13273 1726853325.02696: _execute() done 13273 1726853325.02700: dumping result to json 13273 1726853325.02702: done dumping result, returning 13273 1726853325.02705: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-5fc3-657d-000000000128] 13273 1726853325.02707: sending task result for task 02083763-bbaf-5fc3-657d-000000000128 13273 1726853325.02790: done sending task result for task 02083763-bbaf-5fc3-657d-000000000128 13273 1726853325.02792: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13273 1726853325.02867: no more pending results, returning what we have 13273 1726853325.02872: results queue empty 13273 1726853325.02874: checking for any_errors_fatal 13273 1726853325.02900: done checking for any_errors_fatal 13273 1726853325.02902: checking for max_fail_percentage 13273 1726853325.02904: done checking for max_fail_percentage 13273 1726853325.02905: checking to see if all hosts have failed and the running result is not ok 13273 1726853325.02905: done checking to see if all hosts have failed 13273 1726853325.02906: getting the remaining hosts for this loop 13273 1726853325.02907: done getting the remaining hosts for this loop 13273 1726853325.02910: getting the next task for host managed_node3 13273 1726853325.02917: done getting next task for host managed_node3 13273 1726853325.02920: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853325.02923: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853325.02942: getting variables 13273 1726853325.02943: in VariableManager get_vars() 13273 1726853325.02989: Calling all_inventory to load vars for managed_node3 13273 1726853325.02992: Calling groups_inventory to load vars for managed_node3 13273 1726853325.02994: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853325.03002: Calling all_plugins_play to load vars for managed_node3 13273 1726853325.03004: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853325.03007: Calling groups_plugins_play to load vars for managed_node3 13273 1726853325.04196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853325.05119: done with get_vars() 13273 1726853325.05134: done getting variables 13273 1726853325.05178: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:28:45 -0400 (0:00:00.102) 0:00:42.941 ****** 13273 1726853325.05200: entering _queue_task() for managed_node3/service 13273 1726853325.05419: worker is 1 (out 
of 1 available) 13273 1726853325.05434: exiting _queue_task() for managed_node3/service 13273 1726853325.05446: done queuing things up, now waiting for results queue to drain 13273 1726853325.05447: waiting for pending results... 13273 1726853325.05626: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853325.05709: in run() - task 02083763-bbaf-5fc3-657d-000000000129 13273 1726853325.05721: variable 'ansible_search_path' from source: unknown 13273 1726853325.05726: variable 'ansible_search_path' from source: unknown 13273 1726853325.05755: calling self._execute() 13273 1726853325.05832: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.05836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853325.05844: variable 'omit' from source: magic vars 13273 1726853325.06117: variable 'ansible_distribution_major_version' from source: facts 13273 1726853325.06126: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853325.06204: variable 'network_provider' from source: set_fact 13273 1726853325.06209: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853325.06212: when evaluation is False, skipping this task 13273 1726853325.06215: _execute() done 13273 1726853325.06217: dumping result to json 13273 1726853325.06219: done dumping result, returning 13273 1726853325.06227: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-5fc3-657d-000000000129] 13273 1726853325.06233: sending task result for task 02083763-bbaf-5fc3-657d-000000000129 13273 1726853325.06317: done sending task result for task 02083763-bbaf-5fc3-657d-000000000129 13273 1726853325.06320: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 
"changed": false } 13273 1726853325.06374: no more pending results, returning what we have 13273 1726853325.06378: results queue empty 13273 1726853325.06379: checking for any_errors_fatal 13273 1726853325.06385: done checking for any_errors_fatal 13273 1726853325.06386: checking for max_fail_percentage 13273 1726853325.06387: done checking for max_fail_percentage 13273 1726853325.06388: checking to see if all hosts have failed and the running result is not ok 13273 1726853325.06389: done checking to see if all hosts have failed 13273 1726853325.06390: getting the remaining hosts for this loop 13273 1726853325.06391: done getting the remaining hosts for this loop 13273 1726853325.06394: getting the next task for host managed_node3 13273 1726853325.06399: done getting next task for host managed_node3 13273 1726853325.06402: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853325.06405: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853325.06422: getting variables 13273 1726853325.06423: in VariableManager get_vars() 13273 1726853325.06461: Calling all_inventory to load vars for managed_node3 13273 1726853325.06464: Calling groups_inventory to load vars for managed_node3 13273 1726853325.06466: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853325.06475: Calling all_plugins_play to load vars for managed_node3 13273 1726853325.06477: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853325.06480: Calling groups_plugins_play to load vars for managed_node3 13273 1726853325.07308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853325.08206: done with get_vars() 13273 1726853325.08220: done getting variables 13273 1726853325.08262: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:28:45 -0400 (0:00:00.030) 0:00:42.972 ****** 13273 1726853325.08289: entering _queue_task() for managed_node3/copy 13273 1726853325.08675: worker is 1 (out of 1 available) 13273 1726853325.08685: exiting _queue_task() for managed_node3/copy 13273 1726853325.08696: done queuing things up, now waiting for results queue to drain 13273 1726853325.08697: waiting for pending results... 
13273 1726853325.08982: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853325.09056: in run() - task 02083763-bbaf-5fc3-657d-00000000012a 13273 1726853325.09097: variable 'ansible_search_path' from source: unknown 13273 1726853325.09106: variable 'ansible_search_path' from source: unknown 13273 1726853325.09148: calling self._execute() 13273 1726853325.09260: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.09277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853325.09295: variable 'omit' from source: magic vars 13273 1726853325.09674: variable 'ansible_distribution_major_version' from source: facts 13273 1726853325.09692: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853325.09825: variable 'network_provider' from source: set_fact 13273 1726853325.09829: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853325.09838: when evaluation is False, skipping this task 13273 1726853325.09842: _execute() done 13273 1726853325.09848: dumping result to json 13273 1726853325.09853: done dumping result, returning 13273 1726853325.09864: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-5fc3-657d-00000000012a] 13273 1726853325.09888: sending task result for task 02083763-bbaf-5fc3-657d-00000000012a skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853325.10019: no more pending results, returning what we have 13273 1726853325.10023: results queue empty 13273 1726853325.10024: checking for any_errors_fatal 13273 1726853325.10031: done checking for any_errors_fatal 13273 1726853325.10031: checking for max_fail_percentage 13273 
1726853325.10033: done checking for max_fail_percentage 13273 1726853325.10034: checking to see if all hosts have failed and the running result is not ok 13273 1726853325.10034: done checking to see if all hosts have failed 13273 1726853325.10035: getting the remaining hosts for this loop 13273 1726853325.10036: done getting the remaining hosts for this loop 13273 1726853325.10039: getting the next task for host managed_node3 13273 1726853325.10045: done getting next task for host managed_node3 13273 1726853325.10049: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853325.10053: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853325.10078: getting variables 13273 1726853325.10079: in VariableManager get_vars() 13273 1726853325.10123: Calling all_inventory to load vars for managed_node3 13273 1726853325.10126: Calling groups_inventory to load vars for managed_node3 13273 1726853325.10128: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853325.10136: Calling all_plugins_play to load vars for managed_node3 13273 1726853325.10138: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853325.10140: Calling groups_plugins_play to load vars for managed_node3 13273 1726853325.10683: done sending task result for task 02083763-bbaf-5fc3-657d-00000000012a 13273 1726853325.10687: WORKER PROCESS EXITING 13273 1726853325.10921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853325.11840: done with get_vars() 13273 1726853325.11859: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:28:45 -0400 (0:00:00.036) 0:00:43.008 ****** 13273 1726853325.11929: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853325.12204: worker is 1 (out of 1 available) 13273 1726853325.12216: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853325.12229: done queuing things up, now waiting for results queue to drain 13273 1726853325.12230: waiting for pending results... 
13273 1726853325.12698: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853325.12703: in run() - task 02083763-bbaf-5fc3-657d-00000000012b 13273 1726853325.12705: variable 'ansible_search_path' from source: unknown 13273 1726853325.12708: variable 'ansible_search_path' from source: unknown 13273 1726853325.12713: calling self._execute() 13273 1726853325.12813: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.12832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853325.12847: variable 'omit' from source: magic vars 13273 1726853325.13181: variable 'ansible_distribution_major_version' from source: facts 13273 1726853325.13197: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853325.13207: variable 'omit' from source: magic vars 13273 1726853325.13261: variable 'omit' from source: magic vars 13273 1726853325.13405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853325.15312: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853325.15354: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853325.15381: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853325.15409: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853325.15430: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853325.15487: variable 'network_provider' from source: set_fact 13273 1726853325.15576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853325.15596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853325.15615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853325.15640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853325.15652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853325.15705: variable 'omit' from source: magic vars 13273 1726853325.15781: variable 'omit' from source: magic vars 13273 1726853325.15851: variable 'network_connections' from source: task vars 13273 1726853325.15860: variable 'port1_profile' from source: play vars 13273 1726853325.15902: variable 'port1_profile' from source: play vars 13273 1726853325.15910: variable 'port2_profile' from source: play vars 13273 1726853325.15953: variable 'port2_profile' from source: play vars 13273 1726853325.16053: variable 'omit' from source: magic vars 13273 1726853325.16060: variable '__lsr_ansible_managed' from source: task vars 13273 1726853325.16101: variable '__lsr_ansible_managed' from source: task vars 13273 1726853325.16229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13273 1726853325.16369: Loaded config def from plugin (lookup/template) 13273 1726853325.16374: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13273 1726853325.16394: File lookup term: get_ansible_managed.j2 13273 1726853325.16398: variable 'ansible_search_path' from source: unknown 13273 1726853325.16401: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13273 1726853325.16412: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13273 1726853325.16424: variable 'ansible_search_path' from source: unknown 13273 1726853325.20163: variable 'ansible_managed' from source: unknown 13273 1726853325.20293: variable 'omit' from source: magic vars 13273 1726853325.20301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853325.20325: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853325.20360: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853325.20363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853325.20365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853325.20428: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853325.20431: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.20434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853325.20474: Set connection var ansible_connection to ssh 13273 1726853325.20482: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853325.20487: Set connection var ansible_shell_executable to /bin/sh 13273 1726853325.20500: Set connection var ansible_shell_type to sh 13273 1726853325.20527: Set connection var ansible_pipelining to False 13273 1726853325.20530: Set connection var ansible_timeout to 10 13273 1726853325.20533: variable 'ansible_shell_executable' from source: unknown 13273 1726853325.20535: variable 'ansible_connection' from source: unknown 13273 1726853325.20537: variable 'ansible_module_compression' from source: unknown 13273 1726853325.20540: variable 'ansible_shell_type' from source: unknown 13273 1726853325.20542: variable 'ansible_shell_executable' from source: unknown 13273 1726853325.20544: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.20551: variable 'ansible_pipelining' from source: unknown 13273 1726853325.20553: variable 'ansible_timeout' from source: unknown 13273 1726853325.20555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853325.20699: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853325.20709: variable 'omit' from source: magic vars 13273 1726853325.20712: starting attempt loop 13273 1726853325.20717: running the handler 13273 1726853325.20725: _low_level_execute_command(): starting 13273 1726853325.20797: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853325.21277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853325.21294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853325.21302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853325.21328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853325.21332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853325.21400: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853325.21453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853325.21457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853325.21459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 13273 1726853325.21556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853325.23266: stdout chunk (state=3): >>>/root <<< 13273 1726853325.23381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853325.23443: stderr chunk (state=3): >>><<< 13273 1726853325.23448: stdout chunk (state=3): >>><<< 13273 1726853325.23557: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853325.23561: _low_level_execute_command(): starting 13273 1726853325.23564: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147 `" && echo 
ansible-tmp-1726853325.2346447-15251-133757085274147="` echo /root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147 `" ) && sleep 0' 13273 1726853325.24091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853325.24157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853325.24168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853325.24246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853325.26217: stdout chunk (state=3): >>>ansible-tmp-1726853325.2346447-15251-133757085274147=/root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147 <<< 13273 1726853325.26325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853325.26346: stderr chunk (state=3): >>><<< 13273 1726853325.26352: stdout chunk (state=3): >>><<< 13273 1726853325.26367: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853325.2346447-15251-133757085274147=/root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853325.26404: variable 'ansible_module_compression' from source: unknown 13273 1726853325.26437: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13273 1726853325.26482: variable 'ansible_facts' from source: unknown 13273 1726853325.26574: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/AnsiballZ_network_connections.py 13273 1726853325.26666: Sending initial data 13273 1726853325.26670: Sent initial data (168 bytes) 13273 1726853325.27107: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853325.27111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853325.27117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853325.27121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853325.27123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853325.27168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853325.27177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853325.27234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853325.28831: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853325.28897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853325.28983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpn455lofd /root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/AnsiballZ_network_connections.py <<< 13273 1726853325.28986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/AnsiballZ_network_connections.py" <<< 13273 1726853325.29035: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpn455lofd" to remote "/root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/AnsiballZ_network_connections.py" <<< 13273 1726853325.30058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853325.30107: stderr chunk (state=3): >>><<< 13273 1726853325.30112: stdout chunk (state=3): >>><<< 13273 1726853325.30158: done transferring module to remote 13273 1726853325.30164: _low_level_execute_command(): starting 13273 1726853325.30168: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/ /root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/AnsiballZ_network_connections.py && sleep 0' 13273 1726853325.30693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853325.30757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853325.30760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853325.30833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853325.32689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853325.32713: stderr chunk (state=3): >>><<< 13273 1726853325.32716: stdout chunk (state=3): >>><<< 13273 1726853325.32731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853325.32734: _low_level_execute_command(): starting 13273 1726853325.32738: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/AnsiballZ_network_connections.py && sleep 0' 13273 1726853325.33196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853325.33199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853325.33201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853325.33204: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853325.33206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853325.33246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853325.33260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853325.33331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853325.75987: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lgyre7qi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lgyre7qi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d: error=unknown <<< 13273 1726853325.77887: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lgyre7qi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lgyre7qi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/889074c2-782b-4057-b1eb-a43c769be906: error=unknown <<< 13273 1726853325.78099: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13273 1726853325.80089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853325.80114: stderr chunk (state=3): >>><<< 13273 1726853325.80135: stdout chunk (state=3): >>><<< 13273 1726853325.80285: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lgyre7qi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lgyre7qi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/4ad0cc7b-d5cb-4e1d-92ec-ada243ff0f9d: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lgyre7qi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lgyre7qi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/889074c2-782b-4057-b1eb-a43c769be906: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": 
"absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853325.80289: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853325.80292: _low_level_execute_command(): starting 13273 1726853325.80294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853325.2346447-15251-133757085274147/ > /dev/null 2>&1 && sleep 0' 13273 1726853325.80886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853325.80959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853325.80980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853325.81002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853325.81101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853325.83031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853325.83035: stdout chunk (state=3): >>><<< 13273 1726853325.83038: stderr chunk (state=3): >>><<< 13273 1726853325.83176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853325.83180: handler run complete 13273 1726853325.83182: attempt loop complete, returning result 13273 1726853325.83184: _execute() done 13273 1726853325.83187: dumping result to json 13273 1726853325.83189: done dumping result, returning 13273 1726853325.83191: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-5fc3-657d-00000000012b] 13273 1726853325.83193: sending task result for task 02083763-bbaf-5fc3-657d-00000000012b 13273 1726853325.83267: done sending task result for task 02083763-bbaf-5fc3-657d-00000000012b 13273 1726853325.83273: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13273 1726853325.83385: no more pending results, returning what we have 13273 1726853325.83389: results queue empty 13273 1726853325.83390: checking for any_errors_fatal 13273 1726853325.83396: done checking for any_errors_fatal 13273 1726853325.83397: checking for max_fail_percentage 13273 1726853325.83399: done checking for max_fail_percentage 13273 1726853325.83400: checking to see if all hosts have failed and the running result is not ok 13273 1726853325.83400: done checking to see if all hosts have failed 13273 1726853325.83401: getting the remaining hosts for this loop 13273 1726853325.83403: done getting the remaining hosts for this loop 13273 1726853325.83406: getting the next task for host managed_node3 13273 
1726853325.83413: done getting next task for host managed_node3 13273 1726853325.83417: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853325.83420: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853325.83433: getting variables 13273 1726853325.83435: in VariableManager get_vars() 13273 1726853325.83699: Calling all_inventory to load vars for managed_node3 13273 1726853325.83702: Calling groups_inventory to load vars for managed_node3 13273 1726853325.83705: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853325.83714: Calling all_plugins_play to load vars for managed_node3 13273 1726853325.83717: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853325.83721: Calling groups_plugins_play to load vars for managed_node3 13273 1726853325.85354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853325.86900: done with get_vars() 13273 1726853325.86929: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:28:45 -0400 (0:00:00.750) 0:00:43.759 ****** 13273 1726853325.87027: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853325.87401: worker is 1 (out of 1 
available) 13273 1726853325.87413: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853325.87427: done queuing things up, now waiting for results queue to drain 13273 1726853325.87429: waiting for pending results... 13273 1726853325.87892: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853325.87906: in run() - task 02083763-bbaf-5fc3-657d-00000000012c 13273 1726853325.87928: variable 'ansible_search_path' from source: unknown 13273 1726853325.87935: variable 'ansible_search_path' from source: unknown 13273 1726853325.87985: calling self._execute() 13273 1726853325.88104: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.88121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853325.88136: variable 'omit' from source: magic vars 13273 1726853325.88775: variable 'ansible_distribution_major_version' from source: facts 13273 1726853325.88780: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853325.88783: variable 'network_state' from source: role '' defaults 13273 1726853325.88785: Evaluated conditional (network_state != {}): False 13273 1726853325.88787: when evaluation is False, skipping this task 13273 1726853325.88789: _execute() done 13273 1726853325.88791: dumping result to json 13273 1726853325.88793: done dumping result, returning 13273 1726853325.88796: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-5fc3-657d-00000000012c] 13273 1726853325.88798: sending task result for task 02083763-bbaf-5fc3-657d-00000000012c 13273 1726853325.88877: done sending task result for task 02083763-bbaf-5fc3-657d-00000000012c 13273 1726853325.88881: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13273 1726853325.88944: no more pending results, returning what we have 13273 1726853325.88951: results queue empty 13273 1726853325.88953: checking for any_errors_fatal 13273 1726853325.88968: done checking for any_errors_fatal 13273 1726853325.88969: checking for max_fail_percentage 13273 1726853325.88973: done checking for max_fail_percentage 13273 1726853325.88974: checking to see if all hosts have failed and the running result is not ok 13273 1726853325.88975: done checking to see if all hosts have failed 13273 1726853325.88976: getting the remaining hosts for this loop 13273 1726853325.88977: done getting the remaining hosts for this loop 13273 1726853325.88982: getting the next task for host managed_node3 13273 1726853325.88989: done getting next task for host managed_node3 13273 1726853325.88994: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853325.88999: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 13273 1726853325.89028: getting variables 13273 1726853325.89029: in VariableManager get_vars() 13273 1726853325.89262: Calling all_inventory to load vars for managed_node3 13273 1726853325.89266: Calling groups_inventory to load vars for managed_node3 13273 1726853325.89269: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853325.89285: Calling all_plugins_play to load vars for managed_node3 13273 1726853325.89289: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853325.89292: Calling groups_plugins_play to load vars for managed_node3 13273 1726853325.91557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853325.95063: done with get_vars() 13273 1726853325.95088: done getting variables 13273 1726853325.95148: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:28:45 -0400 (0:00:00.085) 0:00:43.844 ****** 13273 1726853325.95561: entering _queue_task() for managed_node3/debug 13273 1726853325.96437: worker is 1 (out of 1 available) 13273 1726853325.96450: exiting _queue_task() for managed_node3/debug 13273 1726853325.96462: done queuing things up, now waiting for results queue to drain 13273 1726853325.96463: waiting for pending results... 
13273 1726853325.97099: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853325.97657: in run() - task 02083763-bbaf-5fc3-657d-00000000012d 13273 1726853325.97661: variable 'ansible_search_path' from source: unknown 13273 1726853325.97664: variable 'ansible_search_path' from source: unknown 13273 1726853325.97667: calling self._execute() 13273 1726853325.98075: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.98080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853325.98083: variable 'omit' from source: magic vars 13273 1726853325.98901: variable 'ansible_distribution_major_version' from source: facts 13273 1726853325.98979: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853325.98992: variable 'omit' from source: magic vars 13273 1726853325.99070: variable 'omit' from source: magic vars 13273 1726853325.99148: variable 'omit' from source: magic vars 13273 1726853325.99210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853325.99266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853325.99364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853325.99373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853325.99377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853325.99380: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853325.99389: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.99397: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13273 1726853325.99520: Set connection var ansible_connection to ssh 13273 1726853325.99539: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853325.99555: Set connection var ansible_shell_executable to /bin/sh 13273 1726853325.99562: Set connection var ansible_shell_type to sh 13273 1726853325.99579: Set connection var ansible_pipelining to False 13273 1726853325.99597: Set connection var ansible_timeout to 10 13273 1726853325.99631: variable 'ansible_shell_executable' from source: unknown 13273 1726853325.99640: variable 'ansible_connection' from source: unknown 13273 1726853325.99676: variable 'ansible_module_compression' from source: unknown 13273 1726853325.99679: variable 'ansible_shell_type' from source: unknown 13273 1726853325.99684: variable 'ansible_shell_executable' from source: unknown 13273 1726853325.99689: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853325.99691: variable 'ansible_pipelining' from source: unknown 13273 1726853325.99693: variable 'ansible_timeout' from source: unknown 13273 1726853325.99699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853325.99978: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853325.99983: variable 'omit' from source: magic vars 13273 1726853325.99985: starting attempt loop 13273 1726853325.99988: running the handler 13273 1726853326.00115: variable '__network_connections_result' from source: set_fact 13273 1726853326.00182: handler run complete 13273 1726853326.00207: attempt loop complete, returning result 13273 1726853326.00215: _execute() done 13273 1726853326.00223: dumping result to json 13273 1726853326.00236: 
done dumping result, returning 13273 1726853326.00255: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-5fc3-657d-00000000012d] 13273 1726853326.00268: sending task result for task 02083763-bbaf-5fc3-657d-00000000012d 13273 1726853326.00421: done sending task result for task 02083763-bbaf-5fc3-657d-00000000012d 13273 1726853326.00425: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
13273 1726853326.00535: no more pending results, returning what we have 13273 1726853326.00539: results queue empty 13273 1726853326.00540: checking for any_errors_fatal 13273 1726853326.00552: done checking for any_errors_fatal 13273 1726853326.00553: checking for max_fail_percentage 13273 1726853326.00555: done checking for max_fail_percentage 13273 1726853326.00556: checking to see if all hosts have failed and the running result is not ok 13273 1726853326.00557: done checking to see if all hosts have failed 13273 1726853326.00558: getting the remaining hosts for this loop 13273 1726853326.00559: done getting the remaining hosts for this loop 13273 1726853326.00563: getting the next task for host managed_node3 13273 1726853326.00684: done getting next task for host managed_node3 13273 1726853326.00688: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13273 1726853326.00692: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False 13273 1726853326.00705: getting variables 13273 1726853326.00707: in VariableManager get_vars() 13273 1726853326.00765: Calling all_inventory to load vars for managed_node3 13273 1726853326.00768: Calling groups_inventory to load vars for managed_node3 13273 1726853326.00875: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853326.00885: Calling all_plugins_play to load vars for managed_node3 13273 1726853326.00892: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853326.00896: Calling groups_plugins_play to load vars for managed_node3 13273 1726853326.02580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853326.04172: done with get_vars() 13273 1726853326.04197: done getting variables 13273 1726853326.04267: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:28:46 -0400 (0:00:00.087) 0:00:43.932 ****** 13273 1726853326.04304: entering _queue_task() for managed_node3/debug 13273 1726853326.04660: worker is 1 (out of 1 available) 13273 1726853326.04674: exiting _queue_task() for managed_node3/debug 13273 1726853326.04798: done queuing things up, now waiting for results queue to drain 13273 1726853326.04800: waiting for pending results... 
13273 1726853326.05091: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13273 1726853326.05165: in run() - task 02083763-bbaf-5fc3-657d-00000000012e 13273 1726853326.05191: variable 'ansible_search_path' from source: unknown 13273 1726853326.05230: variable 'ansible_search_path' from source: unknown 13273 1726853326.05249: calling self._execute() 13273 1726853326.05360: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.05374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853326.05403: variable 'omit' from source: magic vars 13273 1726853326.05796: variable 'ansible_distribution_major_version' from source: facts 13273 1726853326.05838: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853326.05841: variable 'omit' from source: magic vars 13273 1726853326.05892: variable 'omit' from source: magic vars 13273 1726853326.05931: variable 'omit' from source: magic vars 13273 1726853326.05989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853326.06055: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853326.06059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853326.06084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853326.06103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853326.06136: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853326.06163: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.06166: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13273 1726853326.06262: Set connection var ansible_connection to ssh 13273 1726853326.06285: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853326.06312: Set connection var ansible_shell_executable to /bin/sh 13273 1726853326.06315: Set connection var ansible_shell_type to sh 13273 1726853326.06317: Set connection var ansible_pipelining to False 13273 1726853326.06421: Set connection var ansible_timeout to 10 13273 1726853326.06424: variable 'ansible_shell_executable' from source: unknown 13273 1726853326.06426: variable 'ansible_connection' from source: unknown 13273 1726853326.06428: variable 'ansible_module_compression' from source: unknown 13273 1726853326.06430: variable 'ansible_shell_type' from source: unknown 13273 1726853326.06432: variable 'ansible_shell_executable' from source: unknown 13273 1726853326.06434: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.06436: variable 'ansible_pipelining' from source: unknown 13273 1726853326.06438: variable 'ansible_timeout' from source: unknown 13273 1726853326.06440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853326.06551: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853326.06572: variable 'omit' from source: magic vars 13273 1726853326.06584: starting attempt loop 13273 1726853326.06592: running the handler 13273 1726853326.06650: variable '__network_connections_result' from source: set_fact 13273 1726853326.06749: variable '__network_connections_result' from source: set_fact 13273 1726853326.06860: handler run complete 13273 1726853326.06993: attempt loop complete, returning result 13273 1726853326.06996: 
_execute() done 13273 1726853326.06998: dumping result to json 13273 1726853326.06999: done dumping result, returning 13273 1726853326.07002: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-5fc3-657d-00000000012e] 13273 1726853326.07003: sending task result for task 02083763-bbaf-5fc3-657d-00000000012e 13273 1726853326.07069: done sending task result for task 02083763-bbaf-5fc3-657d-00000000012e 13273 1726853326.07073: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0.0",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0.1",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
13273 1726853326.07191: no more pending results, returning what we have 13273 1726853326.07195: results queue empty 13273 1726853326.07196: checking for any_errors_fatal 13273 1726853326.07202: done checking for any_errors_fatal 13273 1726853326.07202: checking for max_fail_percentage 13273 1726853326.07204: done checking for max_fail_percentage 13273 1726853326.07205: checking to see if all hosts have failed and the running result is not ok 13273 1726853326.07205: done checking to see if all hosts have failed 13273 1726853326.07206: getting the remaining hosts for this loop 13273 1726853326.07207: done getting the remaining hosts for this loop 13273 1726853326.07210: getting the next task for host managed_node3 13273 1726853326.07217: done getting next task for host managed_node3 13273 1726853326.07221: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13273 1726853326.07224: ^ state is: HOST STATE:
block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853326.07237: getting variables 13273 1726853326.07239: in VariableManager get_vars() 13273 1726853326.07560: Calling all_inventory to load vars for managed_node3 13273 1726853326.07563: Calling groups_inventory to load vars for managed_node3 13273 1726853326.07565: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853326.07575: Calling all_plugins_play to load vars for managed_node3 13273 1726853326.07578: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853326.07582: Calling groups_plugins_play to load vars for managed_node3 13273 1726853326.09039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853326.10584: done with get_vars() 13273 1726853326.10605: done getting variables 13273 1726853326.10663: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:28:46 -0400 (0:00:00.063) 0:00:43.996 ****** 13273 1726853326.10701: entering 
_queue_task() for managed_node3/debug 13273 1726853326.11006: worker is 1 (out of 1 available) 13273 1726853326.11017: exiting _queue_task() for managed_node3/debug 13273 1726853326.11030: done queuing things up, now waiting for results queue to drain 13273 1726853326.11031: waiting for pending results... 13273 1726853326.11309: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13273 1726853326.11477: in run() - task 02083763-bbaf-5fc3-657d-00000000012f 13273 1726853326.11481: variable 'ansible_search_path' from source: unknown 13273 1726853326.11483: variable 'ansible_search_path' from source: unknown 13273 1726853326.11518: calling self._execute() 13273 1726853326.11620: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.11675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853326.11678: variable 'omit' from source: magic vars 13273 1726853326.12025: variable 'ansible_distribution_major_version' from source: facts 13273 1726853326.12049: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853326.12180: variable 'network_state' from source: role '' defaults 13273 1726853326.12196: Evaluated conditional (network_state != {}): False 13273 1726853326.12204: when evaluation is False, skipping this task 13273 1726853326.12210: _execute() done 13273 1726853326.12254: dumping result to json 13273 1726853326.12257: done dumping result, returning 13273 1726853326.12260: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-5fc3-657d-00000000012f] 13273 1726853326.12262: sending task result for task 02083763-bbaf-5fc3-657d-00000000012f
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
13273 1726853326.12608: no more pending results, returning what we have 13273 1726853326.12611: results
queue empty 13273 1726853326.12612: checking for any_errors_fatal 13273 1726853326.12619: done checking for any_errors_fatal 13273 1726853326.12620: checking for max_fail_percentage 13273 1726853326.12621: done checking for max_fail_percentage 13273 1726853326.12623: checking to see if all hosts have failed and the running result is not ok 13273 1726853326.12623: done checking to see if all hosts have failed 13273 1726853326.12624: getting the remaining hosts for this loop 13273 1726853326.12625: done getting the remaining hosts for this loop 13273 1726853326.12628: getting the next task for host managed_node3 13273 1726853326.12634: done getting next task for host managed_node3 13273 1726853326.12637: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13273 1726853326.12640: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853326.12661: getting variables 13273 1726853326.12662: in VariableManager get_vars() 13273 1726853326.12705: Calling all_inventory to load vars for managed_node3 13273 1726853326.12708: Calling groups_inventory to load vars for managed_node3 13273 1726853326.12710: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853326.12718: Calling all_plugins_play to load vars for managed_node3 13273 1726853326.12720: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853326.12722: Calling groups_plugins_play to load vars for managed_node3 13273 1726853326.13284: done sending task result for task 02083763-bbaf-5fc3-657d-00000000012f 13273 1726853326.13288: WORKER PROCESS EXITING 13273 1726853326.14039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853326.15600: done with get_vars() 13273 1726853326.15621: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:28:46 -0400 (0:00:00.050) 0:00:44.046 ****** 13273 1726853326.15715: entering _queue_task() for managed_node3/ping 13273 1726853326.16025: worker is 1 (out of 1 available) 13273 1726853326.16038: exiting _queue_task() for managed_node3/ping 13273 1726853326.16053: done queuing things up, now waiting for results queue to drain 13273 1726853326.16055: waiting for pending results... 
13273 1726853326.16335: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13273 1726853326.16469: in run() - task 02083763-bbaf-5fc3-657d-000000000130 13273 1726853326.16498: variable 'ansible_search_path' from source: unknown 13273 1726853326.16508: variable 'ansible_search_path' from source: unknown 13273 1726853326.16550: calling self._execute() 13273 1726853326.16656: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.16667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853326.16682: variable 'omit' from source: magic vars 13273 1726853326.17063: variable 'ansible_distribution_major_version' from source: facts 13273 1726853326.17083: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853326.17096: variable 'omit' from source: magic vars 13273 1726853326.17155: variable 'omit' from source: magic vars 13273 1726853326.17194: variable 'omit' from source: magic vars 13273 1726853326.17233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853326.17276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853326.17298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853326.17317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853326.17332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853326.17373: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853326.17383: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.17390: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13273 1726853326.17494: Set connection var ansible_connection to ssh 13273 1726853326.17507: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853326.17515: Set connection var ansible_shell_executable to /bin/sh 13273 1726853326.17521: Set connection var ansible_shell_type to sh 13273 1726853326.17528: Set connection var ansible_pipelining to False 13273 1726853326.17536: Set connection var ansible_timeout to 10 13273 1726853326.17565: variable 'ansible_shell_executable' from source: unknown 13273 1726853326.17574: variable 'ansible_connection' from source: unknown 13273 1726853326.17582: variable 'ansible_module_compression' from source: unknown 13273 1726853326.17587: variable 'ansible_shell_type' from source: unknown 13273 1726853326.17593: variable 'ansible_shell_executable' from source: unknown 13273 1726853326.17599: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.17606: variable 'ansible_pipelining' from source: unknown 13273 1726853326.17612: variable 'ansible_timeout' from source: unknown 13273 1726853326.17619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853326.17821: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853326.17836: variable 'omit' from source: magic vars 13273 1726853326.17844: starting attempt loop 13273 1726853326.17853: running the handler 13273 1726853326.17868: _low_level_execute_command(): starting 13273 1726853326.17879: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853326.18578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853326.18594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 
1726853326.18609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853326.18661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.18731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853326.18752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853326.18780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.18893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.20659: stdout chunk (state=3): >>>/root <<< 13273 1726853326.20929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.20932: stdout chunk (state=3): >>><<< 13273 1726853326.20934: stderr chunk (state=3): >>><<< 13273 1726853326.20937: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853326.20939: _low_level_execute_command(): starting 13273 1726853326.20942: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197 `" && echo ansible-tmp-1726853326.2088754-15287-128022493886197="` echo /root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197 `" ) && sleep 0' 13273 1726853326.21806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853326.21820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 13273 1726853326.21845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.21927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.21981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.24085: stdout chunk (state=3): >>>ansible-tmp-1726853326.2088754-15287-128022493886197=/root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197 <<< 13273 1726853326.24088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.24376: stderr chunk (state=3): >>><<< 13273 1726853326.24379: stdout chunk (state=3): >>><<< 13273 1726853326.24382: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853326.2088754-15287-128022493886197=/root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853326.24384: variable 'ansible_module_compression' from source: unknown 13273 1726853326.24387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13273 1726853326.24389: variable 'ansible_facts' from source: unknown 13273 1726853326.24597: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/AnsiballZ_ping.py 13273 1726853326.24755: Sending initial data 13273 1726853326.24764: Sent initial data (153 bytes) 13273 1726853326.25321: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853326.25423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853326.25438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.25521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.27278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853326.27300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853326.27362: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmptn41m2h_ /root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/AnsiballZ_ping.py <<< 13273 1726853326.27366: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/AnsiballZ_ping.py" <<< 13273 1726853326.27479: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmptn41m2h_" to remote "/root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/AnsiballZ_ping.py" <<< 13273 1726853326.28689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.28733: stderr chunk (state=3): >>><<< 13273 1726853326.28736: stdout chunk (state=3): >>><<< 13273 1726853326.28745: done transferring module to remote 13273 1726853326.28761: _low_level_execute_command(): starting 13273 1726853326.28785: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/ /root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/AnsiballZ_ping.py && sleep 0' 13273 1726853326.29605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.29619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853326.29673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853326.29677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.29727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.31718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.31735: stderr chunk (state=3): >>><<< 13273 1726853326.31749: stdout chunk (state=3): >>><<< 13273 1726853326.32061: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853326.32065: _low_level_execute_command(): starting 13273 1726853326.32067: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/AnsiballZ_ping.py && sleep 0' 13273 1726853326.32727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853326.32741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853326.32760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853326.32781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853326.32799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853326.32812: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853326.32826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.32888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.32935: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853326.32955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853326.32978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.33143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.48734: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13273 1726853326.50099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853326.50103: stderr chunk (state=3): >>><<< 13273 1726853326.50106: stdout chunk (state=3): >>><<< 13273 1726853326.50118: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853326.50137: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853326.50147: _low_level_execute_command(): starting 13273 1726853326.50157: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853326.2088754-15287-128022493886197/ > /dev/null 2>&1 && sleep 0' 13273 1726853326.50652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.50703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853326.50710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853326.50713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.50851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.52640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.52683: stderr chunk (state=3): >>><<< 13273 1726853326.52686: stdout chunk (state=3): >>><<< 13273 1726853326.52698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853326.52707: handler run complete 13273 
1726853326.52721: attempt loop complete, returning result 13273 1726853326.52790: _execute() done 13273 1726853326.52793: dumping result to json 13273 1726853326.52795: done dumping result, returning 13273 1726853326.52800: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-5fc3-657d-000000000130] 13273 1726853326.52802: sending task result for task 02083763-bbaf-5fc3-657d-000000000130 13273 1726853326.52889: done sending task result for task 02083763-bbaf-5fc3-657d-000000000130 13273 1726853326.52892: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 13273 1726853326.52970: no more pending results, returning what we have 13273 1726853326.52974: results queue empty 13273 1726853326.52975: checking for any_errors_fatal 13273 1726853326.52981: done checking for any_errors_fatal 13273 1726853326.52981: checking for max_fail_percentage 13273 1726853326.52983: done checking for max_fail_percentage 13273 1726853326.52984: checking to see if all hosts have failed and the running result is not ok 13273 1726853326.52985: done checking to see if all hosts have failed 13273 1726853326.52986: getting the remaining hosts for this loop 13273 1726853326.52987: done getting the remaining hosts for this loop 13273 1726853326.52990: getting the next task for host managed_node3 13273 1726853326.53001: done getting next task for host managed_node3 13273 1726853326.53003: ^ task is: TASK: meta (role_complete) 13273 1726853326.53006: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853326.53016: getting variables 13273 1726853326.53018: in VariableManager get_vars() 13273 1726853326.53066: Calling all_inventory to load vars for managed_node3 13273 1726853326.53069: Calling groups_inventory to load vars for managed_node3 13273 1726853326.53110: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853326.53120: Calling all_plugins_play to load vars for managed_node3 13273 1726853326.53123: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853326.53125: Calling groups_plugins_play to load vars for managed_node3 13273 1726853326.59847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853326.61448: done with get_vars() 13273 1726853326.61480: done getting variables 13273 1726853326.61595: done queuing things up, now waiting for results queue to drain 13273 1726853326.61600: results queue empty 13273 1726853326.61601: checking for any_errors_fatal 13273 1726853326.61609: done checking for any_errors_fatal 13273 1726853326.61610: checking for max_fail_percentage 13273 1726853326.61611: done checking for max_fail_percentage 13273 1726853326.61612: checking to see if all hosts have failed and the running result is not ok 13273 1726853326.61613: done checking to see if all hosts have failed 13273 1726853326.61614: getting the remaining hosts for this loop 13273 1726853326.61615: done getting the remaining hosts for this loop 13273 1726853326.61618: getting the next task for host managed_node3 13273 1726853326.61621: done getting next task for host managed_node3 13273 1726853326.61625: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 13273 1726853326.61628: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853326.61631: getting variables 13273 1726853326.61632: in VariableManager get_vars() 13273 1726853326.61654: Calling all_inventory to load vars for managed_node3 13273 1726853326.61657: Calling groups_inventory to load vars for managed_node3 13273 1726853326.61659: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853326.61664: Calling all_plugins_play to load vars for managed_node3 13273 1726853326.61666: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853326.61679: Calling groups_plugins_play to load vars for managed_node3 13273 1726853326.62603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853326.63555: done with get_vars() 13273 1726853326.63573: done getting variables 13273 1726853326.63617: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13273 1726853326.63693: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Friday 20 September 2024 13:28:46 -0400 (0:00:00.479) 0:00:44.526 ****** 13273 1726853326.63712: entering _queue_task() for managed_node3/command 13273 1726853326.64080: worker is 1 (out of 1 available) 13273 1726853326.64093: exiting _queue_task() for managed_node3/command 13273 1726853326.64104: done queuing things up, now waiting for results queue to drain 
13273 1726853326.64105: waiting for pending results... 13273 1726853326.64593: running TaskExecutor() for managed_node3/TASK: From the active connection, get the controller profile "bond0" 13273 1726853326.64603: in run() - task 02083763-bbaf-5fc3-657d-000000000160 13273 1726853326.64606: variable 'ansible_search_path' from source: unknown 13273 1726853326.64625: calling self._execute() 13273 1726853326.64748: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.64777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853326.64780: variable 'omit' from source: magic vars 13273 1726853326.65197: variable 'ansible_distribution_major_version' from source: facts 13273 1726853326.65207: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853326.65300: variable 'network_provider' from source: set_fact 13273 1726853326.65303: Evaluated conditional (network_provider == "nm"): True 13273 1726853326.65311: variable 'omit' from source: magic vars 13273 1726853326.65328: variable 'omit' from source: magic vars 13273 1726853326.65394: variable 'controller_profile' from source: play vars 13273 1726853326.65414: variable 'omit' from source: magic vars 13273 1726853326.65453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853326.65481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853326.65497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853326.65514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853326.65525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853326.65551: variable 'inventory_hostname' from 
source: host vars for 'managed_node3' 13273 1726853326.65554: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.65557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853326.65637: Set connection var ansible_connection to ssh 13273 1726853326.65646: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853326.65650: Set connection var ansible_shell_executable to /bin/sh 13273 1726853326.65652: Set connection var ansible_shell_type to sh 13273 1726853326.65654: Set connection var ansible_pipelining to False 13273 1726853326.65705: Set connection var ansible_timeout to 10 13273 1726853326.65708: variable 'ansible_shell_executable' from source: unknown 13273 1726853326.65711: variable 'ansible_connection' from source: unknown 13273 1726853326.65719: variable 'ansible_module_compression' from source: unknown 13273 1726853326.65725: variable 'ansible_shell_type' from source: unknown 13273 1726853326.65729: variable 'ansible_shell_executable' from source: unknown 13273 1726853326.65731: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853326.65733: variable 'ansible_pipelining' from source: unknown 13273 1726853326.65735: variable 'ansible_timeout' from source: unknown 13273 1726853326.65738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853326.65892: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853326.65895: variable 'omit' from source: magic vars 13273 1726853326.65898: starting attempt loop 13273 1726853326.65900: running the handler 13273 1726853326.65903: _low_level_execute_command(): starting 13273 1726853326.65908: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853326.66476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853326.66500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853326.66506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.66567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.66633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.68347: stdout chunk (state=3): >>>/root <<< 13273 1726853326.68460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.68498: stderr chunk (state=3): >>><<< 13273 1726853326.68502: stdout chunk (state=3): >>><<< 13273 1726853326.68530: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853326.68557: _low_level_execute_command(): starting 13273 1726853326.68561: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734 `" && echo ansible-tmp-1726853326.685306-15310-154728173039734="` echo /root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734 `" ) && sleep 0' 13273 1726853326.69064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853326.69068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.69081: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.69120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853326.69124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.69200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.71199: stdout chunk (state=3): >>>ansible-tmp-1726853326.685306-15310-154728173039734=/root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734 <<< 13273 1726853326.71306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.71374: stderr chunk (state=3): >>><<< 13273 1726853326.71379: stdout chunk (state=3): >>><<< 13273 1726853326.71410: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853326.685306-15310-154728173039734=/root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853326.71453: variable 'ansible_module_compression' from source: unknown 13273 1726853326.71506: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853326.71535: variable 'ansible_facts' from source: unknown 13273 1726853326.71596: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/AnsiballZ_command.py 13273 1726853326.71695: Sending initial data 13273 1726853326.71699: Sent initial data (155 bytes) 13273 1726853326.72427: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.72437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.72521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.72552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.74316: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853326.74374: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853326.74435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpi85r39s4 /root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/AnsiballZ_command.py <<< 13273 1726853326.74438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/AnsiballZ_command.py" <<< 13273 1726853326.74502: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpi85r39s4" to remote "/root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/AnsiballZ_command.py" <<< 13273 1726853326.75145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.75158: stderr chunk (state=3): >>><<< 13273 1726853326.75161: stdout chunk (state=3): >>><<< 13273 1726853326.75180: done transferring module to remote 13273 1726853326.75189: _low_level_execute_command(): starting 13273 1726853326.75194: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/ /root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/AnsiballZ_command.py && sleep 0' 13273 1726853326.75605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853326.75608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.75611: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853326.75613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.75654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853326.75674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.75733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.77706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853326.77709: stdout chunk (state=3): >>><<< 13273 1726853326.77712: stderr chunk (state=3): >>><<< 13273 1726853326.77801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853326.77805: _low_level_execute_command(): starting 13273 1726853326.77808: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/AnsiballZ_command.py && sleep 0' 13273 1726853326.78420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853326.78445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853326.78501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853326.78516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853326.78595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853326.78611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting 
O_NONBLOCK <<< 13273 1726853326.78658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.78743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853326.96122: stdout chunk (state=3): >>> {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 3903334e-7358-4806-a114-5ea6dbf2cacf\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1726853319\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 
(default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\<<< 13273 1726853326.96156: stdout chunk (state=3): >>>nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 3903334e-7358-4806-a114-5ea6dbf2cacf\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/28\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/23\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.225/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:3a:0b:9d:f3:04:69\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: 
dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1726853559\nDHCP4.OPTION[7]: host_name = ip-10-31-11-217\nDHCP4.OPTION[8]: ip_address = 192.0.2.225\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::86/128\nIP6.ADDRESS[2]: 2001:db8::380b:9dff:fef3:469/64\nIP6.ADDRESS[3]: fe80::380b:9dff:fef3:469/64\nIP6.GATEWAY: fe80::b48a:8aff:feef:317f\nIP6.ROUTE[1]: dst = 2001:db8::86/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::b48a:8aff:feef:317f, mt = 300\nIP6.DNS[1]: 2001:db8::68f1:14ff:fe28:32e6\nIP6.DNS[2]: fe80::b48a:8aff:feef:317f\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:63:43:fa:a7:0c:5b:23:53:97:5b:3b:a1:3f:9b:6e:ae\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::68f1:14ff:fe28:32e6\nDHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-11-217\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::86", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": 
"2024-09-20 13:28:46.940548", "end": "2024-09-20 13:28:46.959580", "delta": "0:00:00.019032", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853326.97777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853326.97780: stdout chunk (state=3): >>><<< 13273 1726853326.97783: stderr chunk (state=3): >>><<< 13273 1726853326.97981: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 3903334e-7358-4806-a114-5ea6dbf2cacf\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1726853319\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 
(default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 3903334e-7358-4806-a114-5ea6dbf2cacf\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/28\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/23\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 
192.0.2.225/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:3a:0b:9d:f3:04:69\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1726853559\nDHCP4.OPTION[7]: host_name = ip-10-31-11-217\nDHCP4.OPTION[8]: ip_address = 192.0.2.225\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::86/128\nIP6.ADDRESS[2]: 2001:db8::380b:9dff:fef3:469/64\nIP6.ADDRESS[3]: fe80::380b:9dff:fef3:469/64\nIP6.GATEWAY: fe80::b48a:8aff:feef:317f\nIP6.ROUTE[1]: dst = 2001:db8::86/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::b48a:8aff:feef:317f, mt = 300\nIP6.DNS[1]: 2001:db8::68f1:14ff:fe28:32e6\nIP6.DNS[2]: fe80::b48a:8aff:feef:317f\nDHCP6.OPTION[1]: 
dhcp6_client_id = 00:04:63:43:fa:a7:0c:5b:23:53:97:5b:3b:a1:3f:9b:6e:ae\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::68f1:14ff:fe28:32e6\nDHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-11-217\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::86", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-20 13:28:46.940548", "end": "2024-09-20 13:28:46.959580", "delta": "0:00:00.019032", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
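The module result above carries the whole `nmcli c show --active bond0` dump as one newline-joined string in `stdout`. When post-processing a log like this, it is convenient to turn that `KEY: VALUE` listing back into a mapping. A minimal sketch (the function name and the treatment of nmcli's `--` empty marker are this sketch's own choices, not part of Ansible or nmcli):

```python
def parse_nmcli_show(text: str) -> dict:
    """Parse `nmcli c show` output (one `KEY: VALUE` per line) into a dict.

    nmcli prints `--` for unset properties; map those to None.
    Splitting on the *first* colon keeps IPv6 values like
    `fe80::b48a:8aff:feef:317f` intact.
    """
    props = {}
    for line in text.splitlines():
        key, sep, value = line.partition(":")
        if not sep:
            continue  # skip lines without a KEY: VALUE shape
        value = value.strip()
        props[key.strip()] = None if value == "--" else value
    return props


# Sample lines taken from the stdout captured above.
sample = (
    "connection.id:                          bond0\n"
    "connection.type:                        bond\n"
    "connection.zone:                        --\n"
    "bond.options:                           mode=active-backup,miimon=110\n"
)
props = parse_nmcli_show(sample)
# props["connection.id"] == "bond0"; props["connection.zone"] is None
```

A parser like this is what lets a follow-up assert task check, for example, that `bond.options` actually contains `mode=active-backup`.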
13273 1726853326.97990: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853326.97993: _low_level_execute_command(): starting 13273 1726853326.97996: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853326.685306-15310-154728173039734/ > /dev/null 2>&1 && sleep 0' 13273 1726853326.99222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853326.99244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853326.99345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853326.99395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853326.99442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853326.99479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853326.99744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853327.01585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853327.01607: stderr chunk (state=3): >>><<< 13273 1726853327.01610: stdout chunk (state=3): >>><<< 13273 1726853327.01628: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 13273 1726853327.01631: handler run complete 13273 1726853327.01658: Evaluated conditional (False): False 13273 1726853327.01667: attempt loop complete, returning result 13273 1726853327.01670: _execute() done 13273 1726853327.01674: dumping result to json 13273 1726853327.01688: done dumping result, returning 13273 1726853327.01695: done running TaskExecutor() for managed_node3/TASK: From the active connection, get the controller profile "bond0" [02083763-bbaf-5fc3-657d-000000000160] 13273 1726853327.01699: sending task result for task 02083763-bbaf-5fc3-657d-000000000160 13273 1726853327.01864: done sending task result for task 02083763-bbaf-5fc3-657d-000000000160 13273 1726853327.01867: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0" ], "delta": "0:00:00.019032", "end": "2024-09-20 13:28:46.959580", "rc": 0, "start": "2024-09-20 13:28:46.940548" } STDOUT: connection.id: bond0 connection.uuid: 3903334e-7358-4806-a114-5ea6dbf2cacf connection.stable-id: -- connection.type: bond connection.interface-name: nm-bond connection.autoconnect: yes connection.autoconnect-priority: 0 connection.autoconnect-retries: -1 (default) connection.multi-connect: 0 (default) connection.auth-retries: -1 connection.timestamp: 1726853319 connection.permissions: -- connection.zone: -- connection.controller: -- connection.master: -- connection.slave-type: -- connection.port-type: -- connection.autoconnect-slaves: -1 (default) connection.autoconnect-ports: -1 (default) connection.down-on-poweroff: -1 (default) connection.secondaries: -- connection.gateway-ping-timeout: 0 connection.metered: unknown connection.lldp: default connection.mdns: -1 (default) connection.llmnr: -1 (default) connection.dns-over-tls: -1 (default) connection.mptcp-flags: 0x0 (default) connection.wait-device-timeout: -1 connection.wait-activation-delay: -1 ipv4.method: auto ipv4.dns: -- ipv4.dns-search: -- 
ipv4.dns-options: -- ipv4.dns-priority: 0 ipv4.addresses: -- ipv4.gateway: -- ipv4.routes: -- ipv4.route-metric: 65535 ipv4.route-table: 0 (unspec) ipv4.routing-rules: -- ipv4.replace-local-rule: -1 (default) ipv4.dhcp-send-release: -1 (default) ipv4.ignore-auto-routes: no ipv4.ignore-auto-dns: no ipv4.dhcp-client-id: -- ipv4.dhcp-iaid: -- ipv4.dhcp-dscp: -- ipv4.dhcp-timeout: 0 (default) ipv4.dhcp-send-hostname: yes ipv4.dhcp-hostname: -- ipv4.dhcp-fqdn: -- ipv4.dhcp-hostname-flags: 0x0 (none) ipv4.never-default: no ipv4.may-fail: yes ipv4.required-timeout: -1 (default) ipv4.dad-timeout: -1 (default) ipv4.dhcp-vendor-class-identifier: -- ipv4.link-local: 0 (default) ipv4.dhcp-reject-servers: -- ipv4.auto-route-ext-gw: -1 (default) ipv6.method: auto ipv6.dns: -- ipv6.dns-search: -- ipv6.dns-options: -- ipv6.dns-priority: 0 ipv6.addresses: -- ipv6.gateway: -- ipv6.routes: -- ipv6.route-metric: -1 ipv6.route-table: 0 (unspec) ipv6.routing-rules: -- ipv6.replace-local-rule: -1 (default) ipv6.dhcp-send-release: -1 (default) ipv6.ignore-auto-routes: no ipv6.ignore-auto-dns: no ipv6.never-default: no ipv6.may-fail: yes ipv6.required-timeout: -1 (default) ipv6.ip6-privacy: -1 (default) ipv6.temp-valid-lifetime: 0 (default) ipv6.temp-preferred-lifetime: 0 (default) ipv6.addr-gen-mode: default ipv6.ra-timeout: 0 (default) ipv6.mtu: auto ipv6.dhcp-pd-hint: -- ipv6.dhcp-duid: -- ipv6.dhcp-iaid: -- ipv6.dhcp-timeout: 0 (default) ipv6.dhcp-send-hostname: yes ipv6.dhcp-hostname: -- ipv6.dhcp-hostname-flags: 0x0 (none) ipv6.auto-route-ext-gw: -1 (default) ipv6.token: -- bond.options: mode=active-backup,miimon=110 proxy.method: none proxy.browser-only: no proxy.pac-url: -- proxy.pac-script: -- GENERAL.NAME: bond0 GENERAL.UUID: 3903334e-7358-4806-a114-5ea6dbf2cacf GENERAL.DEVICES: nm-bond GENERAL.IP-IFACE: nm-bond GENERAL.STATE: activated GENERAL.DEFAULT: no GENERAL.DEFAULT6: yes GENERAL.SPEC-OBJECT: -- GENERAL.VPN: no GENERAL.DBUS-PATH: 
/org/freedesktop/NetworkManager/ActiveConnection/28 GENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/23 GENERAL.ZONE: -- GENERAL.MASTER-PATH: -- IP4.ADDRESS[1]: 192.0.2.225/24 IP4.GATEWAY: 192.0.2.1 IP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535 IP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535 IP4.DNS[1]: 192.0.2.1 DHCP4.OPTION[1]: broadcast_address = 192.0.2.255 DHCP4.OPTION[2]: dhcp_client_identifier = 01:3a:0b:9d:f3:04:69 DHCP4.OPTION[3]: dhcp_lease_time = 240 DHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1 DHCP4.OPTION[5]: domain_name_servers = 192.0.2.1 DHCP4.OPTION[6]: expiry = 1726853559 DHCP4.OPTION[7]: host_name = ip-10-31-11-217 DHCP4.OPTION[8]: ip_address = 192.0.2.225 DHCP4.OPTION[9]: next_server = 192.0.2.1 DHCP4.OPTION[10]: requested_broadcast_address = 1 DHCP4.OPTION[11]: requested_domain_name = 1 DHCP4.OPTION[12]: requested_domain_name_servers = 1 DHCP4.OPTION[13]: requested_domain_search = 1 DHCP4.OPTION[14]: requested_host_name = 1 DHCP4.OPTION[15]: requested_interface_mtu = 1 DHCP4.OPTION[16]: requested_ms_classless_static_routes = 1 DHCP4.OPTION[17]: requested_nis_domain = 1 DHCP4.OPTION[18]: requested_nis_servers = 1 DHCP4.OPTION[19]: requested_ntp_servers = 1 DHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1 DHCP4.OPTION[21]: requested_root_path = 1 DHCP4.OPTION[22]: requested_routers = 1 DHCP4.OPTION[23]: requested_static_routes = 1 DHCP4.OPTION[24]: requested_subnet_mask = 1 DHCP4.OPTION[25]: requested_time_offset = 1 DHCP4.OPTION[26]: requested_wpad = 1 DHCP4.OPTION[27]: routers = 192.0.2.1 DHCP4.OPTION[28]: subnet_mask = 255.255.255.0 IP6.ADDRESS[1]: 2001:db8::86/128 IP6.ADDRESS[2]: 2001:db8::380b:9dff:fef3:469/64 IP6.ADDRESS[3]: fe80::380b:9dff:fef3:469/64 IP6.GATEWAY: fe80::b48a:8aff:feef:317f IP6.ROUTE[1]: dst = 2001:db8::86/128, nh = ::, mt = 300 IP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300 IP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024 IP6.ROUTE[4]: dst = ::/0, nh = 
fe80::b48a:8aff:feef:317f, mt = 300 IP6.DNS[1]: 2001:db8::68f1:14ff:fe28:32e6 IP6.DNS[2]: fe80::b48a:8aff:feef:317f DHCP6.OPTION[1]: dhcp6_client_id = 00:04:63:43:fa:a7:0c:5b:23:53:97:5b:3b:a1:3f:9b:6e:ae DHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::68f1:14ff:fe28:32e6 DHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-11-217 DHCP6.OPTION[4]: iaid = 8c:3b:13:c0 DHCP6.OPTION[5]: ip6_address = 2001:db8::86 13273 1726853327.01993: no more pending results, returning what we have 13273 1726853327.01997: results queue empty 13273 1726853327.01998: checking for any_errors_fatal 13273 1726853327.02000: done checking for any_errors_fatal 13273 1726853327.02001: checking for max_fail_percentage 13273 1726853327.02002: done checking for max_fail_percentage 13273 1726853327.02003: checking to see if all hosts have failed and the running result is not ok 13273 1726853327.02004: done checking to see if all hosts have failed 13273 1726853327.02004: getting the remaining hosts for this loop 13273 1726853327.02006: done getting the remaining hosts for this loop 13273 1726853327.02009: getting the next task for host managed_node3 13273 1726853327.02014: done getting next task for host managed_node3 13273 1726853327.02016: ^ task is: TASK: Assert that the controller profile is activated 13273 1726853327.02018: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853327.02021: getting variables 13273 1726853327.02023: in VariableManager get_vars() 13273 1726853327.02068: Calling all_inventory to load vars for managed_node3 13273 1726853327.02078: Calling groups_inventory to load vars for managed_node3 13273 1726853327.02081: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.02090: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.02092: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.02094: Calling groups_plugins_play to load vars for managed_node3 13273 1726853327.05007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.06676: done with get_vars() 13273 1726853327.06704: done getting variables 13273 1726853327.06765: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Friday 20 September 2024 13:28:47 -0400 (0:00:00.430) 0:00:44.957 ****** 13273 1726853327.06803: entering _queue_task() for managed_node3/assert 13273 1726853327.07303: worker is 1 (out of 1 available) 13273 1726853327.07321: exiting _queue_task() for managed_node3/assert 13273 1726853327.07334: done queuing things up, now waiting for results queue to drain 13273 1726853327.07455: waiting for pending results... 
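The `ok:` result above records `"cmd": ["nmcli", "c", "show", "--active", "bond0"]` with `rc: 0`. The playbook source itself is not part of this log, so the following is a hedged reconstruction of what that task likely looks like; the registered variable name `active_controller_profile` is inferred from the later assert task's variable lookup, not confirmed by the log:

```yaml
# Hypothetical reconstruction of the logged task at
# tests_bond_removal.yml (exact body not shown in this log).
# The command list matches the logged "cmd"; the register name is
# taken from the variable the subsequent assert evaluates.
- name: From the active connection, get the controller profile "bond0"
  command: nmcli c show --active bond0
  register: active_controller_profile
  changed_when: false
```

`changed_when: false` matches the logged `"changed": false`, since a read-only `nmcli c show` should never report a change.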
13273 1726853327.07831: running TaskExecutor() for managed_node3/TASK: Assert that the controller profile is activated 13273 1726853327.08178: in run() - task 02083763-bbaf-5fc3-657d-000000000161 13273 1726853327.08184: variable 'ansible_search_path' from source: unknown 13273 1726853327.08375: calling self._execute() 13273 1726853327.08512: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.08518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.08531: variable 'omit' from source: magic vars 13273 1726853327.09536: variable 'ansible_distribution_major_version' from source: facts 13273 1726853327.09633: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853327.09817: variable 'network_provider' from source: set_fact 13273 1726853327.09836: Evaluated conditional (network_provider == "nm"): True 13273 1726853327.09847: variable 'omit' from source: magic vars 13273 1726853327.09877: variable 'omit' from source: magic vars 13273 1726853327.10046: variable 'controller_profile' from source: play vars 13273 1726853327.10051: variable 'omit' from source: magic vars 13273 1726853327.10074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853327.10121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853327.10149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853327.10165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853327.10190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853327.10214: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853327.10218: variable 
'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.10220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.10290: Set connection var ansible_connection to ssh 13273 1726853327.10298: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853327.10329: Set connection var ansible_shell_executable to /bin/sh 13273 1726853327.10333: Set connection var ansible_shell_type to sh 13273 1726853327.10336: Set connection var ansible_pipelining to False 13273 1726853327.10338: Set connection var ansible_timeout to 10 13273 1726853327.10350: variable 'ansible_shell_executable' from source: unknown 13273 1726853327.10353: variable 'ansible_connection' from source: unknown 13273 1726853327.10356: variable 'ansible_module_compression' from source: unknown 13273 1726853327.10358: variable 'ansible_shell_type' from source: unknown 13273 1726853327.10361: variable 'ansible_shell_executable' from source: unknown 13273 1726853327.10363: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.10367: variable 'ansible_pipelining' from source: unknown 13273 1726853327.10369: variable 'ansible_timeout' from source: unknown 13273 1726853327.10374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.10530: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853327.10534: variable 'omit' from source: magic vars 13273 1726853327.10541: starting attempt loop 13273 1726853327.10546: running the handler 13273 1726853327.10682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853327.12488: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853327.12529: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853327.12566: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853327.12592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853327.12611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853327.12662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853327.12686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853327.12704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853327.12729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853327.12739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853327.12815: variable 'active_controller_profile' from source: set_fact 13273 1726853327.12837: Evaluated conditional (active_controller_profile.stdout | length != 0): True 13273 1726853327.12843: handler run complete 13273 1726853327.12855: attempt 
loop complete, returning result 13273 1726853327.12858: _execute() done 13273 1726853327.12861: dumping result to json 13273 1726853327.12864: done dumping result, returning 13273 1726853327.12870: done running TaskExecutor() for managed_node3/TASK: Assert that the controller profile is activated [02083763-bbaf-5fc3-657d-000000000161] 13273 1726853327.12876: sending task result for task 02083763-bbaf-5fc3-657d-000000000161 13273 1726853327.12965: done sending task result for task 02083763-bbaf-5fc3-657d-000000000161 13273 1726853327.12968: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13273 1726853327.13043: no more pending results, returning what we have 13273 1726853327.13049: results queue empty 13273 1726853327.13050: checking for any_errors_fatal 13273 1726853327.13059: done checking for any_errors_fatal 13273 1726853327.13060: checking for max_fail_percentage 13273 1726853327.13062: done checking for max_fail_percentage 13273 1726853327.13062: checking to see if all hosts have failed and the running result is not ok 13273 1726853327.13063: done checking to see if all hosts have failed 13273 1726853327.13064: getting the remaining hosts for this loop 13273 1726853327.13066: done getting the remaining hosts for this loop 13273 1726853327.13069: getting the next task for host managed_node3 13273 1726853327.13076: done getting next task for host managed_node3 13273 1726853327.13079: ^ task is: TASK: Get the controller device details 13273 1726853327.13080: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853327.13083: getting variables 13273 1726853327.13092: in VariableManager get_vars() 13273 1726853327.13141: Calling all_inventory to load vars for managed_node3 13273 1726853327.13144: Calling groups_inventory to load vars for managed_node3 13273 1726853327.13148: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.13157: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.13160: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.13162: Calling groups_plugins_play to load vars for managed_node3 13273 1726853327.13969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.14967: done with get_vars() 13273 1726853327.15004: done getting variables 13273 1726853327.15096: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Friday 20 September 2024 13:28:47 -0400 (0:00:00.083) 0:00:45.040 ****** 13273 1726853327.15132: entering _queue_task() for managed_node3/command 13273 1726853327.15468: worker is 1 (out of 1 available) 13273 1726853327.15482: exiting _queue_task() for managed_node3/command 13273 1726853327.15495: done queuing things up, now waiting for results queue to drain 13273 1726853327.15497: waiting for pending results... 
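The handler above logs `Evaluated conditional (active_controller_profile.stdout | length != 0): True` after `Evaluated conditional (network_provider == "nm"): True`, ending in "All assertions passed". A minimal task consistent with that logged behavior might be (a sketch reconstructed from the log, not taken from the playbook source):

```yaml
# Hypothetical shape of the assert task at tests_bond_removal.yml:221
# (the nm-provider variant at :207 is what ran here). Both conditions
# below appear verbatim in the logged conditional evaluations.
- name: Assert that the controller profile is activated
  assert:
    that:
      - active_controller_profile.stdout | length != 0
  when: network_provider == "nm"
```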
13273 1726853327.15839: running TaskExecutor() for managed_node3/TASK: Get the controller device details 13273 1726853327.15963: in run() - task 02083763-bbaf-5fc3-657d-000000000162 13273 1726853327.15968: variable 'ansible_search_path' from source: unknown 13273 1726853327.15973: calling self._execute() 13273 1726853327.16083: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.16093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.16139: variable 'omit' from source: magic vars 13273 1726853327.16491: variable 'ansible_distribution_major_version' from source: facts 13273 1726853327.16503: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853327.16591: variable 'network_provider' from source: set_fact 13273 1726853327.16616: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853327.16621: when evaluation is False, skipping this task 13273 1726853327.16629: _execute() done 13273 1726853327.16634: dumping result to json 13273 1726853327.16637: done dumping result, returning 13273 1726853327.16639: done running TaskExecutor() for managed_node3/TASK: Get the controller device details [02083763-bbaf-5fc3-657d-000000000162] 13273 1726853327.16642: sending task result for task 02083763-bbaf-5fc3-657d-000000000162 13273 1726853327.16772: done sending task result for task 02083763-bbaf-5fc3-657d-000000000162 13273 1726853327.16776: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853327.16835: no more pending results, returning what we have 13273 1726853327.16838: results queue empty 13273 1726853327.16839: checking for any_errors_fatal 13273 1726853327.16844: done checking for any_errors_fatal 13273 1726853327.16848: checking for max_fail_percentage 13273 1726853327.16849: done checking for 
max_fail_percentage 13273 1726853327.16850: checking to see if all hosts have failed and the running result is not ok 13273 1726853327.16851: done checking to see if all hosts have failed 13273 1726853327.16851: getting the remaining hosts for this loop 13273 1726853327.16852: done getting the remaining hosts for this loop 13273 1726853327.16855: getting the next task for host managed_node3 13273 1726853327.16860: done getting next task for host managed_node3 13273 1726853327.16863: ^ task is: TASK: Assert that the controller profile is activated 13273 1726853327.16865: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853327.16868: getting variables 13273 1726853327.16869: in VariableManager get_vars() 13273 1726853327.16917: Calling all_inventory to load vars for managed_node3 13273 1726853327.16920: Calling groups_inventory to load vars for managed_node3 13273 1726853327.16922: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.16931: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.16938: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.16950: Calling groups_plugins_play to load vars for managed_node3 13273 1726853327.17996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.18867: done with get_vars() 13273 1726853327.18884: done getting variables 13273 1726853327.18922: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Friday 20 September 2024 13:28:47 -0400 (0:00:00.038) 0:00:45.078 ****** 13273 1726853327.18941: entering _queue_task() for managed_node3/assert 13273 1726853327.19240: worker is 1 (out of 1 available) 13273 1726853327.19257: exiting _queue_task() for managed_node3/assert 13273 1726853327.19273: done queuing things up, now waiting for results queue to drain 13273 1726853327.19274: waiting for pending results... 13273 1726853327.19463: running TaskExecutor() for managed_node3/TASK: Assert that the controller profile is activated 13273 1726853327.19538: in run() - task 02083763-bbaf-5fc3-657d-000000000163 13273 1726853327.19552: variable 'ansible_search_path' from source: unknown 13273 1726853327.19581: calling self._execute() 13273 1726853327.19655: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.19667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.19677: variable 'omit' from source: magic vars 13273 1726853327.19953: variable 'ansible_distribution_major_version' from source: facts 13273 1726853327.19962: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853327.20044: variable 'network_provider' from source: set_fact 13273 1726853327.20050: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853327.20053: when evaluation is False, skipping this task 13273 1726853327.20056: _execute() done 13273 1726853327.20058: dumping result to json 13273 1726853327.20061: done dumping result, returning 13273 1726853327.20065: done running TaskExecutor() for managed_node3/TASK: Assert that the controller profile is activated [02083763-bbaf-5fc3-657d-000000000163] 13273 1726853327.20072: sending task result for task 
02083763-bbaf-5fc3-657d-000000000163 13273 1726853327.20160: done sending task result for task 02083763-bbaf-5fc3-657d-000000000163 13273 1726853327.20162: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853327.20234: no more pending results, returning what we have 13273 1726853327.20237: results queue empty 13273 1726853327.20238: checking for any_errors_fatal 13273 1726853327.20242: done checking for any_errors_fatal 13273 1726853327.20243: checking for max_fail_percentage 13273 1726853327.20244: done checking for max_fail_percentage 13273 1726853327.20247: checking to see if all hosts have failed and the running result is not ok 13273 1726853327.20248: done checking to see if all hosts have failed 13273 1726853327.20249: getting the remaining hosts for this loop 13273 1726853327.20250: done getting the remaining hosts for this loop 13273 1726853327.20252: getting the next task for host managed_node3 13273 1726853327.20261: done getting next task for host managed_node3 13273 1726853327.20265: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13273 1726853327.20269: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853327.20289: getting variables 13273 1726853327.20290: in VariableManager get_vars() 13273 1726853327.20331: Calling all_inventory to load vars for managed_node3 13273 1726853327.20333: Calling groups_inventory to load vars for managed_node3 13273 1726853327.20336: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.20343: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.20347: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.20350: Calling groups_plugins_play to load vars for managed_node3 13273 1726853327.21394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.22515: done with get_vars() 13273 1726853327.22533: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:28:47 -0400 (0:00:00.036) 0:00:45.115 ****** 13273 1726853327.22619: entering _queue_task() for managed_node3/include_tasks 13273 1726853327.22899: worker is 1 (out of 1 available) 13273 1726853327.22911: exiting _queue_task() for managed_node3/include_tasks 13273 1726853327.22923: done queuing things up, now waiting for results queue to drain 13273 1726853327.22924: waiting for pending results... 
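Both tasks skipped above report `"false_condition": "network_provider == \"initscripts\""` — the usual pattern of provider-specific verification tasks gated by a `when:` clause. The actual module and arguments of those tasks are not recorded in this log, only the skip condition, so the following is purely illustrative:

```yaml
# Hypothetical shape only: the real task body is elided from the log.
# Only the "when" condition is confirmed (logged as false_condition).
- name: Get the controller device details
  command: "{{ controller_device_details_cmd }}"  # placeholder, not from the log
  when: network_provider == "initscripts"
```

Because the conditional is false under the `nm` provider, Ansible emits `skipping:` with `skip_reason: Conditional result was False` rather than dispatching the task to the worker.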
13273 1726853327.23236: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13273 1726853327.23301: in run() - task 02083763-bbaf-5fc3-657d-00000000016c 13273 1726853327.23311: variable 'ansible_search_path' from source: unknown 13273 1726853327.23314: variable 'ansible_search_path' from source: unknown 13273 1726853327.23361: calling self._execute() 13273 1726853327.23427: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.23433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.23441: variable 'omit' from source: magic vars 13273 1726853327.23736: variable 'ansible_distribution_major_version' from source: facts 13273 1726853327.23748: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853327.23752: _execute() done 13273 1726853327.23755: dumping result to json 13273 1726853327.23757: done dumping result, returning 13273 1726853327.23764: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-5fc3-657d-00000000016c] 13273 1726853327.23767: sending task result for task 02083763-bbaf-5fc3-657d-00000000016c 13273 1726853327.23858: done sending task result for task 02083763-bbaf-5fc3-657d-00000000016c 13273 1726853327.23861: WORKER PROCESS EXITING 13273 1726853327.23915: no more pending results, returning what we have 13273 1726853327.23919: in VariableManager get_vars() 13273 1726853327.23968: Calling all_inventory to load vars for managed_node3 13273 1726853327.23979: Calling groups_inventory to load vars for managed_node3 13273 1726853327.23982: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.23990: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.23993: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.23995: Calling 
groups_plugins_play to load vars for managed_node3 13273 1726853327.25344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.26881: done with get_vars() 13273 1726853327.26899: variable 'ansible_search_path' from source: unknown 13273 1726853327.26901: variable 'ansible_search_path' from source: unknown 13273 1726853327.26936: we have included files to process 13273 1726853327.26937: generating all_blocks data 13273 1726853327.26939: done generating all_blocks data 13273 1726853327.26944: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853327.26945: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853327.26947: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13273 1726853327.27690: done processing included file 13273 1726853327.27692: iterating over new_blocks loaded from include file 13273 1726853327.27694: in VariableManager get_vars() 13273 1726853327.27740: done with get_vars() 13273 1726853327.27743: filtering new block on tags 13273 1726853327.27781: done filtering new block on tags 13273 1726853327.27784: in VariableManager get_vars() 13273 1726853327.27820: done with get_vars() 13273 1726853327.27823: filtering new block on tags 13273 1726853327.27868: done filtering new block on tags 13273 1726853327.27875: in VariableManager get_vars() 13273 1726853327.27907: done with get_vars() 13273 1726853327.27909: filtering new block on tags 13273 1726853327.27955: done filtering new block on tags 13273 1726853327.27957: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13273 1726853327.27962: extending task lists for 
all hosts with included blocks 13273 1726853327.28611: done extending task lists 13273 1726853327.28612: done processing included files 13273 1726853327.28613: results queue empty 13273 1726853327.28613: checking for any_errors_fatal 13273 1726853327.28616: done checking for any_errors_fatal 13273 1726853327.28617: checking for max_fail_percentage 13273 1726853327.28618: done checking for max_fail_percentage 13273 1726853327.28618: checking to see if all hosts have failed and the running result is not ok 13273 1726853327.28618: done checking to see if all hosts have failed 13273 1726853327.28619: getting the remaining hosts for this loop 13273 1726853327.28620: done getting the remaining hosts for this loop 13273 1726853327.28621: getting the next task for host managed_node3 13273 1726853327.28625: done getting next task for host managed_node3 13273 1726853327.28627: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13273 1726853327.28629: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853327.28638: getting variables 13273 1726853327.28638: in VariableManager get_vars() 13273 1726853327.28653: Calling all_inventory to load vars for managed_node3 13273 1726853327.28655: Calling groups_inventory to load vars for managed_node3 13273 1726853327.28656: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.28659: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.28661: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.28663: Calling groups_plugins_play to load vars for managed_node3 13273 1726853327.29314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.30722: done with get_vars() 13273 1726853327.30739: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:28:47 -0400 (0:00:00.081) 0:00:45.197 ****** 13273 1726853327.30817: entering _queue_task() for managed_node3/setup 13273 1726853327.31148: worker is 1 (out of 1 available) 13273 1726853327.31160: exiting _queue_task() for managed_node3/setup 13273 1726853327.31174: done queuing things up, now waiting for results queue to drain 13273 1726853327.31175: waiting for pending results... 
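The include above resolves `roles/network/tasks/set_facts.yml`, whose first task (`set_facts.yml:3`, "Ensure ansible_facts used by role are present") runs the `setup` action to gather the facts the role needs. Judging from the logged task path `roles/network/tasks/main.yml:4`, the include itself is likely just (a reconstruction; the role source is not in this log):

```yaml
# roles/network/tasks/main.yml — hypothetical sketch matching the
# logged task name and the logged include of set_facts.yml.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
```

Note the log's include machinery at work after dispatch: "we have included files to process", "generating all_blocks data", then per-block tag filtering before the new tasks are appended with "extending task lists for all hosts with included blocks".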
13273 1726853327.31591: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13273 1726853327.31653: in run() - task 02083763-bbaf-5fc3-657d-000000000914 13273 1726853327.31676: variable 'ansible_search_path' from source: unknown 13273 1726853327.31684: variable 'ansible_search_path' from source: unknown 13273 1726853327.31727: calling self._execute() 13273 1726853327.31840: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.31856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.31873: variable 'omit' from source: magic vars 13273 1726853327.32288: variable 'ansible_distribution_major_version' from source: facts 13273 1726853327.32303: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853327.32779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853327.36024: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853327.36106: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853327.36157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853327.36203: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853327.36244: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853327.36323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853327.36372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853327.36415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853327.36461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853327.36488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853327.36557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853327.36589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853327.36686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853327.36689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853327.36691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853327.36853: variable '__network_required_facts' from source: role 
'' defaults 13273 1726853327.36867: variable 'ansible_facts' from source: unknown 13273 1726853327.37696: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13273 1726853327.37707: when evaluation is False, skipping this task 13273 1726853327.37714: _execute() done 13273 1726853327.37721: dumping result to json 13273 1726853327.37728: done dumping result, returning 13273 1726853327.37740: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-5fc3-657d-000000000914] 13273 1726853327.37753: sending task result for task 02083763-bbaf-5fc3-657d-000000000914 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853327.37922: no more pending results, returning what we have 13273 1726853327.37926: results queue empty 13273 1726853327.37927: checking for any_errors_fatal 13273 1726853327.37929: done checking for any_errors_fatal 13273 1726853327.37930: checking for max_fail_percentage 13273 1726853327.37932: done checking for max_fail_percentage 13273 1726853327.37933: checking to see if all hosts have failed and the running result is not ok 13273 1726853327.37933: done checking to see if all hosts have failed 13273 1726853327.37934: getting the remaining hosts for this loop 13273 1726853327.37936: done getting the remaining hosts for this loop 13273 1726853327.37940: getting the next task for host managed_node3 13273 1726853327.37955: done getting next task for host managed_node3 13273 1726853327.37959: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13273 1726853327.37966: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853327.37992: getting variables 13273 1726853327.37994: in VariableManager get_vars() 13273 1726853327.38160: Calling all_inventory to load vars for managed_node3 13273 1726853327.38164: Calling groups_inventory to load vars for managed_node3 13273 1726853327.38166: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.38178: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.38182: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.38185: Calling groups_plugins_play to load vars for managed_node3 13273 1726853327.39104: done sending task result for task 02083763-bbaf-5fc3-657d-000000000914 13273 1726853327.39107: WORKER PROCESS EXITING 13273 1726853327.40054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.41983: done with get_vars() 13273 1726853327.42005: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:28:47 -0400 (0:00:00.113) 0:00:45.310 ****** 13273 1726853327.42141: entering _queue_task() for managed_node3/stat 13273 1726853327.42854: worker is 1 (out of 1 available) 13273 1726853327.42866: exiting _queue_task() for managed_node3/stat 13273 1726853327.42879: done queuing things up, now waiting for results queue to drain 13273 1726853327.42880: waiting for pending results... 13273 1726853327.43287: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13273 1726853327.43785: in run() - task 02083763-bbaf-5fc3-657d-000000000916 13273 1726853327.43800: variable 'ansible_search_path' from source: unknown 13273 1726853327.43805: variable 'ansible_search_path' from source: unknown 13273 1726853327.43892: calling self._execute() 13273 1726853327.44101: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.44107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.44117: variable 'omit' from source: magic vars 13273 1726853327.44943: variable 'ansible_distribution_major_version' from source: facts 13273 1726853327.44954: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853327.45916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853327.46386: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853327.46430: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853327.46559: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853327.46703: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
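The skip logged above for "Ensure ansible_facts used by role are present" comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: Ansible's `difference` filter behaves like a set difference, so the setup task runs only when some required fact is not yet gathered. A minimal Python sketch of that gating logic (the fact names below are illustrative, not the role's actual list):

```python
def missing_required_facts(required, gathered_facts):
    """Approximate `required | difference(gathered_facts.keys() | list)`:
    return the required fact names not yet present."""
    return [name for name in required if name not in gathered_facts]

# The setup task runs only when the difference is non-empty.
required = ["distribution", "os_family"]
facts = {"distribution": "CentOS", "os_family": "RedHat"}
run_setup = len(missing_required_facts(required, facts)) > 0
# Every required fact is already present here, so run_setup is False
# and the task is skipped — matching the log above.
```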
13273 1726853327.46784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853327.46811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853327.46836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853327.46862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853327.47153: variable '__network_is_ostree' from source: set_fact 13273 1726853327.47162: Evaluated conditional (not __network_is_ostree is defined): False 13273 1726853327.47165: when evaluation is False, skipping this task 13273 1726853327.47168: _execute() done 13273 1726853327.47172: dumping result to json 13273 1726853327.47175: done dumping result, returning 13273 1726853327.47184: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-5fc3-657d-000000000916] 13273 1726853327.47188: sending task result for task 02083763-bbaf-5fc3-657d-000000000916 13273 1726853327.47289: done sending task result for task 02083763-bbaf-5fc3-657d-000000000916 13273 1726853327.47292: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13273 1726853327.47376: no more pending results, returning what we have 13273 1726853327.47381: results queue empty 13273 1726853327.47382: checking for any_errors_fatal 13273 1726853327.47388: 
done checking for any_errors_fatal 13273 1726853327.47389: checking for max_fail_percentage 13273 1726853327.47391: done checking for max_fail_percentage 13273 1726853327.47391: checking to see if all hosts have failed and the running result is not ok 13273 1726853327.47392: done checking to see if all hosts have failed 13273 1726853327.47393: getting the remaining hosts for this loop 13273 1726853327.47395: done getting the remaining hosts for this loop 13273 1726853327.47398: getting the next task for host managed_node3 13273 1726853327.47406: done getting next task for host managed_node3 13273 1726853327.47410: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13273 1726853327.47416: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853327.47553: getting variables 13273 1726853327.47556: in VariableManager get_vars() 13273 1726853327.47610: Calling all_inventory to load vars for managed_node3 13273 1726853327.47613: Calling groups_inventory to load vars for managed_node3 13273 1726853327.47616: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.47624: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.47627: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.47629: Calling groups_plugins_play to load vars for managed_node3 13273 1726853327.50494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.53906: done with get_vars() 13273 1726853327.53937: done getting variables 13273 1726853327.54122: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:28:47 -0400 (0:00:00.120) 0:00:45.430 ****** 13273 1726853327.54168: entering _queue_task() for managed_node3/set_fact 13273 1726853327.54930: worker is 1 (out of 1 available) 13273 1726853327.54943: exiting _queue_task() for managed_node3/set_fact 13273 1726853327.55280: done queuing things up, now waiting for results queue to drain 13273 1726853327.55282: waiting for pending results... 
13273 1726853327.55590: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13273 1726853327.56053: in run() - task 02083763-bbaf-5fc3-657d-000000000917 13273 1726853327.56057: variable 'ansible_search_path' from source: unknown 13273 1726853327.56060: variable 'ansible_search_path' from source: unknown 13273 1726853327.56074: calling self._execute() 13273 1726853327.56255: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.56268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.56385: variable 'omit' from source: magic vars 13273 1726853327.57355: variable 'ansible_distribution_major_version' from source: facts 13273 1726853327.57358: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853327.57588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853327.58189: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853327.58221: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853327.58385: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853327.58412: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853327.58619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853327.58644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853327.58791: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853327.58817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853327.59017: variable '__network_is_ostree' from source: set_fact 13273 1726853327.59026: Evaluated conditional (not __network_is_ostree is defined): False 13273 1726853327.59029: when evaluation is False, skipping this task 13273 1726853327.59032: _execute() done 13273 1726853327.59034: dumping result to json 13273 1726853327.59037: done dumping result, returning 13273 1726853327.59045: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-5fc3-657d-000000000917] 13273 1726853327.59053: sending task result for task 02083763-bbaf-5fc3-657d-000000000917 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13273 1726853327.59542: no more pending results, returning what we have 13273 1726853327.59547: results queue empty 13273 1726853327.59549: checking for any_errors_fatal 13273 1726853327.59554: done checking for any_errors_fatal 13273 1726853327.59554: checking for max_fail_percentage 13273 1726853327.59556: done checking for max_fail_percentage 13273 1726853327.59557: checking to see if all hosts have failed and the running result is not ok 13273 1726853327.59557: done checking to see if all hosts have failed 13273 1726853327.59558: getting the remaining hosts for this loop 13273 1726853327.59559: done getting the remaining hosts for this loop 13273 1726853327.59562: getting the next task for host managed_node3 13273 1726853327.59572: done getting next task for host managed_node3 13273 
1726853327.59576: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13273 1726853327.59581: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853327.59599: getting variables 13273 1726853327.59600: in VariableManager get_vars() 13273 1726853327.59648: Calling all_inventory to load vars for managed_node3 13273 1726853327.59652: Calling groups_inventory to load vars for managed_node3 13273 1726853327.59654: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853327.59665: Calling all_plugins_play to load vars for managed_node3 13273 1726853327.59668: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853327.59787: Calling groups_plugins_play to load vars for managed_node3 13273 1726853327.60307: done sending task result for task 02083763-bbaf-5fc3-657d-000000000917 13273 1726853327.60312: WORKER PROCESS EXITING 13273 1726853327.63076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853327.66499: done with get_vars() 13273 1726853327.66535: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:28:47 -0400 (0:00:00.125) 0:00:45.556 ****** 13273 1726853327.66757: entering _queue_task() for managed_node3/service_facts 13273 1726853327.67505: worker is 1 (out of 1 available) 13273 1726853327.67516: exiting _queue_task() for managed_node3/service_facts 13273 1726853327.67527: done queuing things up, now waiting for results queue to drain 13273 1726853327.67528: waiting for pending results... 
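The "Check which services are running" task queued above uses the `service_facts` module, which populates `ansible_facts.services` as a dict keyed by unit name (each entry carrying fields such as `state`). The role can then branch on whether units like `NetworkManager.service` are active. A sketch of filtering such a dict (the dict shape follows `service_facts` output; the unit names are illustrative):

```python
def running_services(services, names):
    """Given a service_facts-style dict keyed by unit name with a
    "state" field, return which of the requested services are
    currently running."""
    return [
        n for n in names
        if services.get(n, {}).get("state") == "running"
    ]

services = {
    "NetworkManager.service": {"state": "running"},
    "wpa_supplicant.service": {"state": "stopped"},
}
running_services(services, ["NetworkManager.service",
                            "wpa_supplicant.service"])
# → ["NetworkManager.service"]
```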
13273 1726853327.67853: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13273 1726853327.68267: in run() - task 02083763-bbaf-5fc3-657d-000000000919 13273 1726853327.68474: variable 'ansible_search_path' from source: unknown 13273 1726853327.68478: variable 'ansible_search_path' from source: unknown 13273 1726853327.68481: calling self._execute() 13273 1726853327.68581: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.68655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.68669: variable 'omit' from source: magic vars 13273 1726853327.69980: variable 'ansible_distribution_major_version' from source: facts 13273 1726853327.69984: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853327.69986: variable 'omit' from source: magic vars 13273 1726853327.69988: variable 'omit' from source: magic vars 13273 1726853327.70310: variable 'omit' from source: magic vars 13273 1726853327.70359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853327.70402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853327.70677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853327.70681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853327.70684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853327.70687: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853327.70690: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.70692: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13273 1726853327.70778: Set connection var ansible_connection to ssh 13273 1726853327.71078: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853327.71081: Set connection var ansible_shell_executable to /bin/sh 13273 1726853327.71084: Set connection var ansible_shell_type to sh 13273 1726853327.71086: Set connection var ansible_pipelining to False 13273 1726853327.71088: Set connection var ansible_timeout to 10 13273 1726853327.71090: variable 'ansible_shell_executable' from source: unknown 13273 1726853327.71093: variable 'ansible_connection' from source: unknown 13273 1726853327.71095: variable 'ansible_module_compression' from source: unknown 13273 1726853327.71097: variable 'ansible_shell_type' from source: unknown 13273 1726853327.71099: variable 'ansible_shell_executable' from source: unknown 13273 1726853327.71101: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853327.71103: variable 'ansible_pipelining' from source: unknown 13273 1726853327.71105: variable 'ansible_timeout' from source: unknown 13273 1726853327.71107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853327.71316: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853327.71326: variable 'omit' from source: magic vars 13273 1726853327.71341: starting attempt loop 13273 1726853327.71346: running the handler 13273 1726853327.71364: _low_level_execute_command(): starting 13273 1726853327.71374: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853327.72116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853327.72177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853327.72235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853327.72246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853327.72266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853327.72358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853327.74279: stdout chunk (state=3): >>>/root <<< 13273 1726853327.74283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853327.74286: stdout chunk (state=3): >>><<< 13273 1726853327.74288: stderr chunk (state=3): >>><<< 13273 1726853327.74291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853327.74293: _low_level_execute_command(): starting 13273 1726853327.74296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943 `" && echo ansible-tmp-1726853327.7427568-15363-26263928907943="` echo /root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943 `" ) && sleep 0' 13273 1726853327.75076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853327.75080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853327.75083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853327.75086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853327.75089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853327.75091: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853327.75102: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853327.75108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853327.75110: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853327.75113: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853327.75114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853327.75116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853327.75120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853327.75123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853327.75126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853327.75128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853327.75202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853327.77219: stdout chunk (state=3): >>>ansible-tmp-1726853327.7427568-15363-26263928907943=/root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943 <<< 13273 1726853327.77387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853327.77391: stdout chunk (state=3): >>><<< 13273 1726853327.77404: stderr chunk (state=3): >>><<< 13273 1726853327.77498: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853327.7427568-15363-26263928907943=/root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853327.77502: variable 'ansible_module_compression' from source: unknown 13273 1726853327.77676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13273 1726853327.77679: variable 'ansible_facts' from source: unknown 13273 1726853327.77682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/AnsiballZ_service_facts.py 13273 1726853327.77819: Sending initial data 13273 1726853327.77823: Sent initial data (161 bytes) 13273 1726853327.78494: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853327.78574: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853327.78590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853327.78609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853327.78618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853327.78737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853327.80403: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853327.80469: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853327.80539: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpo5mdmj0u /root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/AnsiballZ_service_facts.py <<< 13273 1726853327.80542: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/AnsiballZ_service_facts.py" <<< 13273 1726853327.80592: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpo5mdmj0u" to remote "/root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/AnsiballZ_service_facts.py" <<< 13273 1726853327.81539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853327.81677: stderr chunk (state=3): >>><<< 13273 1726853327.81681: stdout chunk (state=3): >>><<< 13273 1726853327.81683: done transferring module to remote 13273 1726853327.81685: _low_level_execute_command(): starting 13273 1726853327.81687: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/ /root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/AnsiballZ_service_facts.py && sleep 0' 13273 1726853327.82301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853327.82376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853327.82404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853327.82502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853327.84487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853327.84490: stderr chunk (state=3): >>><<< 13273 1726853327.84492: stdout chunk (state=3): >>><<< 13273 1726853327.84495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853327.84497: _low_level_execute_command(): starting 13273 1726853327.84500: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/AnsiballZ_service_facts.py && sleep 0' 13273 1726853327.85091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853327.85276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853327.85280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853327.85282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853327.85333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 
1726853329.45756: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": 
{"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": 
{"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": 
{"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13273 1726853329.47482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853329.47494: stdout chunk (state=3): >>><<< 13273 1726853329.47509: stderr chunk (state=3): >>><<< 13273 1726853329.47602: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": 
{"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853329.49079: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853329.49101: _low_level_execute_command(): starting 13273 1726853329.49116: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853327.7427568-15363-26263928907943/ > /dev/null 2>&1 && sleep 0' 13273 1726853329.49769: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853329.49824: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853329.49840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853329.49869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853329.50079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853329.51940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853329.51961: stdout chunk (state=3): >>><<< 13273 1726853329.51983: stderr chunk (state=3): >>><<< 13273 1726853329.52184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853329.52188: handler run complete 13273 1726853329.52288: variable 'ansible_facts' from source: unknown 13273 1726853329.52576: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853329.53282: variable 'ansible_facts' from source: unknown 13273 1726853329.53430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853329.53731: attempt loop complete, returning result 13273 1726853329.53734: _execute() done 13273 1726853329.53736: dumping result to json 13273 1726853329.53738: done dumping result, returning 13273 1726853329.53747: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-5fc3-657d-000000000919] 13273 1726853329.53757: sending task result for task 02083763-bbaf-5fc3-657d-000000000919 13273 1726853329.54876: done sending task result for task 02083763-bbaf-5fc3-657d-000000000919 13273 1726853329.54883: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853329.54959: no more pending results, returning what we have 13273 1726853329.54962: results queue empty 13273 1726853329.54963: checking for any_errors_fatal 13273 1726853329.54966: done checking for any_errors_fatal 13273 1726853329.54967: checking for max_fail_percentage 13273 1726853329.54968: done checking for max_fail_percentage 13273 1726853329.54969: checking to see if all hosts have failed and the running result is not ok 13273 1726853329.54970: done checking to see if all hosts have failed 13273 1726853329.54972: getting the remaining hosts for this loop 13273 1726853329.54973: done getting the remaining hosts for this loop 13273 1726853329.54976: getting the next task for host managed_node3 13273 1726853329.54983: done getting next task for host managed_node3 13273 1726853329.54986: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13273 
1726853329.55086: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853329.55104: getting variables 13273 1726853329.55106: in VariableManager get_vars() 13273 1726853329.55153: Calling all_inventory to load vars for managed_node3 13273 1726853329.55156: Calling groups_inventory to load vars for managed_node3 13273 1726853329.55158: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853329.55167: Calling all_plugins_play to load vars for managed_node3 13273 1726853329.55172: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853329.55176: Calling groups_plugins_play to load vars for managed_node3 13273 1726853329.57397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853329.60207: done with get_vars() 13273 1726853329.60239: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:28:49 -0400 (0:00:01.937) 0:00:47.494 ****** 13273 1726853329.60556: entering _queue_task() for managed_node3/package_facts 13273 1726853329.61332: worker is 1 (out of 1 available) 13273 1726853329.61348: exiting _queue_task() for managed_node3/package_facts 13273 1726853329.61362: done queuing things up, now waiting for results queue to drain 13273 1726853329.61363: waiting for pending results... 
13273 1726853329.61991: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13273 1726853329.62383: in run() - task 02083763-bbaf-5fc3-657d-00000000091a 13273 1726853329.62387: variable 'ansible_search_path' from source: unknown 13273 1726853329.62390: variable 'ansible_search_path' from source: unknown 13273 1726853329.62393: calling self._execute() 13273 1726853329.62477: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853329.62610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853329.62626: variable 'omit' from source: magic vars 13273 1726853329.63338: variable 'ansible_distribution_major_version' from source: facts 13273 1726853329.63391: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853329.63403: variable 'omit' from source: magic vars 13273 1726853329.63493: variable 'omit' from source: magic vars 13273 1726853329.63781: variable 'omit' from source: magic vars 13273 1726853329.63784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853329.64075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853329.64078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853329.64080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853329.64082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853329.64084: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853329.64086: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853329.64087: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13273 1726853329.64175: Set connection var ansible_connection to ssh 13273 1726853329.64191: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853329.64286: Set connection var ansible_shell_executable to /bin/sh 13273 1726853329.64295: Set connection var ansible_shell_type to sh 13273 1726853329.64306: Set connection var ansible_pipelining to False 13273 1726853329.64315: Set connection var ansible_timeout to 10 13273 1726853329.64344: variable 'ansible_shell_executable' from source: unknown 13273 1726853329.64352: variable 'ansible_connection' from source: unknown 13273 1726853329.64359: variable 'ansible_module_compression' from source: unknown 13273 1726853329.64365: variable 'ansible_shell_type' from source: unknown 13273 1726853329.64373: variable 'ansible_shell_executable' from source: unknown 13273 1726853329.64380: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853329.64388: variable 'ansible_pipelining' from source: unknown 13273 1726853329.64676: variable 'ansible_timeout' from source: unknown 13273 1726853329.64679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853329.64878: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853329.64898: variable 'omit' from source: magic vars 13273 1726853329.64908: starting attempt loop 13273 1726853329.64919: running the handler 13273 1726853329.64940: _low_level_execute_command(): starting 13273 1726853329.64954: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853329.66341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13273 1726853329.66495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853329.66705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853329.66789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853329.68497: stdout chunk (state=3): >>>/root <<< 13273 1726853329.68592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853329.68623: stderr chunk (state=3): >>><<< 13273 1726853329.68634: stdout chunk (state=3): >>><<< 13273 1726853329.68894: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853329.68898: _low_level_execute_command(): starting 13273 1726853329.68901: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655 `" && echo ansible-tmp-1726853329.688004-15454-174285473109655="` echo /root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655 `" ) && sleep 0' 13273 1726853329.69516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853329.69531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853329.69555: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853329.69627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853329.69651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853329.69718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853329.71851: stdout chunk (state=3): >>>ansible-tmp-1726853329.688004-15454-174285473109655=/root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655 <<< 13273 1726853329.72277: stdout chunk (state=3): >>><<< 13273 1726853329.72281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853329.72283: stderr chunk (state=3): >>><<< 13273 1726853329.72286: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853329.688004-15454-174285473109655=/root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853329.72289: variable 'ansible_module_compression' from source: unknown 13273 1726853329.72291: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13273 1726853329.72300: variable 'ansible_facts' from source: unknown 13273 1726853329.72499: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/AnsiballZ_package_facts.py 13273 1726853329.72718: Sending initial data 13273 1726853329.72727: Sent initial data (161 bytes) 13273 1726853329.73202: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853329.73215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853329.73227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853329.73241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853329.73287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853329.73346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853329.73370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853329.73404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853329.73490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853329.75165: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853329.75248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853329.75310: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpzuo7zgto /root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/AnsiballZ_package_facts.py <<< 13273 1726853329.75326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/AnsiballZ_package_facts.py" <<< 13273 1726853329.75368: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpzuo7zgto" to remote "/root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/AnsiballZ_package_facts.py" <<< 13273 1726853329.76893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853329.76931: stderr chunk (state=3): >>><<< 13273 1726853329.76958: stdout chunk (state=3): >>><<< 13273 1726853329.77018: done transferring module to remote 13273 1726853329.77102: _low_level_execute_command(): starting 13273 1726853329.77105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/ /root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/AnsiballZ_package_facts.py && sleep 0' 13273 1726853329.77642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853329.77676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853329.77690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853329.77784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853329.77808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853329.77833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853329.77849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853329.77934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853329.79858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853329.80074: stdout chunk (state=3): >>><<< 13273 1726853329.80078: stderr chunk (state=3): >>><<< 13273 1726853329.80085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853329.80089: _low_level_execute_command(): starting 13273 1726853329.80091: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/AnsiballZ_package_facts.py && sleep 0' 13273 1726853329.80708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853329.80711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853329.80714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853329.80716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853329.80718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853329.80752: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853329.80755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853329.80848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853329.80852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853329.80900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853329.80974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853330.25959: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": 
[{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 13273 1726853330.26132: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": 
[{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": 
"perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": 
"libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": 
"python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13273 1726853330.28174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853330.28178: stdout chunk (state=3): >>><<< 13273 1726853330.28181: stderr chunk (state=3): >>><<< 13273 1726853330.28484: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853330.31907: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853330.31938: _low_level_execute_command(): starting 13273 1726853330.31949: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853329.688004-15454-174285473109655/ > /dev/null 2>&1 && sleep 0' 13273 1726853330.32604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853330.32625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853330.32642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853330.32688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found <<< 13273 1726853330.32706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853330.32796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853330.32844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853330.33000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853330.35178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853330.35181: stdout chunk (state=3): >>><<< 13273 1726853330.35184: stderr chunk (state=3): >>><<< 13273 1726853330.35186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 13273 1726853330.35188: handler run complete 13273 1726853330.36872: variable 'ansible_facts' from source: unknown 13273 1726853330.37472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853330.40566: variable 'ansible_facts' from source: unknown 13273 1726853330.41738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853330.43328: attempt loop complete, returning result 13273 1726853330.43347: _execute() done 13273 1726853330.43577: dumping result to json 13273 1726853330.43878: done dumping result, returning 13273 1726853330.43893: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-5fc3-657d-00000000091a] 13273 1726853330.44009: sending task result for task 02083763-bbaf-5fc3-657d-00000000091a 13273 1726853330.48604: done sending task result for task 02083763-bbaf-5fc3-657d-00000000091a 13273 1726853330.48608: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853330.48762: no more pending results, returning what we have 13273 1726853330.48765: results queue empty 13273 1726853330.48766: checking for any_errors_fatal 13273 1726853330.48772: done checking for any_errors_fatal 13273 1726853330.48773: checking for max_fail_percentage 13273 1726853330.48775: done checking for max_fail_percentage 13273 1726853330.48775: checking to see if all hosts have failed and the running result is not ok 13273 1726853330.48776: done checking to see if all hosts have failed 13273 1726853330.48777: getting the remaining hosts for this loop 13273 1726853330.48778: done getting the remaining hosts for this loop 13273 1726853330.48781: getting the next task for host managed_node3 13273 1726853330.48787: done 
getting next task for host managed_node3 13273 1726853330.48791: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853330.48795: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853330.48806: getting variables 13273 1726853330.48808: in VariableManager get_vars() 13273 1726853330.48850: Calling all_inventory to load vars for managed_node3 13273 1726853330.48852: Calling groups_inventory to load vars for managed_node3 13273 1726853330.48855: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853330.48863: Calling all_plugins_play to load vars for managed_node3 13273 1726853330.48866: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853330.48869: Calling groups_plugins_play to load vars for managed_node3 13273 1726853330.51563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853330.54707: done with get_vars() 13273 1726853330.54742: done getting variables 13273 1726853330.54814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:28:50 -0400 (0:00:00.943) 0:00:48.437 ****** 13273 1726853330.54861: entering _queue_task() for managed_node3/debug 13273 1726853330.55628: worker is 1 (out of 1 available) 13273 1726853330.55643: exiting _queue_task() for managed_node3/debug 13273 1726853330.55659: done queuing things up, now waiting for results queue to drain 13273 1726853330.55660: waiting for pending results... 
13273 1726853330.56289: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13273 1726853330.56419: in run() - task 02083763-bbaf-5fc3-657d-00000000016d 13273 1726853330.56442: variable 'ansible_search_path' from source: unknown 13273 1726853330.56578: variable 'ansible_search_path' from source: unknown 13273 1726853330.56582: calling self._execute() 13273 1726853330.56784: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853330.56797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853330.56812: variable 'omit' from source: magic vars 13273 1726853330.57635: variable 'ansible_distribution_major_version' from source: facts 13273 1726853330.57653: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853330.57666: variable 'omit' from source: magic vars 13273 1726853330.57877: variable 'omit' from source: magic vars 13273 1726853330.58149: variable 'network_provider' from source: set_fact 13273 1726853330.58152: variable 'omit' from source: magic vars 13273 1726853330.58155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853330.58290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853330.58315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853330.58337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853330.58384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853330.58578: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853330.58584: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 
1726853330.58588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853330.58646: Set connection var ansible_connection to ssh 13273 1726853330.58715: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853330.58725: Set connection var ansible_shell_executable to /bin/sh 13273 1726853330.58920: Set connection var ansible_shell_type to sh 13273 1726853330.58923: Set connection var ansible_pipelining to False 13273 1726853330.58926: Set connection var ansible_timeout to 10 13273 1726853330.58928: variable 'ansible_shell_executable' from source: unknown 13273 1726853330.58931: variable 'ansible_connection' from source: unknown 13273 1726853330.58933: variable 'ansible_module_compression' from source: unknown 13273 1726853330.58935: variable 'ansible_shell_type' from source: unknown 13273 1726853330.58937: variable 'ansible_shell_executable' from source: unknown 13273 1726853330.58939: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853330.58941: variable 'ansible_pipelining' from source: unknown 13273 1726853330.58943: variable 'ansible_timeout' from source: unknown 13273 1726853330.58945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853330.59194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853330.59211: variable 'omit' from source: magic vars 13273 1726853330.59255: starting attempt loop 13273 1726853330.59262: running the handler 13273 1726853330.59464: handler run complete 13273 1726853330.59468: attempt loop complete, returning result 13273 1726853330.59470: _execute() done 13273 1726853330.59474: dumping result to json 13273 1726853330.59476: done dumping result, returning 
13273 1726853330.59479: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-5fc3-657d-00000000016d] 13273 1726853330.59481: sending task result for task 02083763-bbaf-5fc3-657d-00000000016d ok: [managed_node3] => {} MSG: Using network provider: nm 13273 1726853330.59640: no more pending results, returning what we have 13273 1726853330.59644: results queue empty 13273 1726853330.59645: checking for any_errors_fatal 13273 1726853330.59658: done checking for any_errors_fatal 13273 1726853330.59659: checking for max_fail_percentage 13273 1726853330.59660: done checking for max_fail_percentage 13273 1726853330.59661: checking to see if all hosts have failed and the running result is not ok 13273 1726853330.59662: done checking to see if all hosts have failed 13273 1726853330.59662: getting the remaining hosts for this loop 13273 1726853330.59664: done getting the remaining hosts for this loop 13273 1726853330.59667: getting the next task for host managed_node3 13273 1726853330.59678: done getting next task for host managed_node3 13273 1726853330.59682: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853330.59687: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853330.59702: getting variables 13273 1726853330.59704: in VariableManager get_vars() 13273 1726853330.59754: Calling all_inventory to load vars for managed_node3 13273 1726853330.59757: Calling groups_inventory to load vars for managed_node3 13273 1726853330.59759: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853330.59768: Calling all_plugins_play to load vars for managed_node3 13273 1726853330.60074: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853330.60081: Calling groups_plugins_play to load vars for managed_node3 13273 1726853330.60787: done sending task result for task 02083763-bbaf-5fc3-657d-00000000016d 13273 1726853330.60790: WORKER PROCESS EXITING 13273 1726853330.62630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853330.65849: done with get_vars() 13273 1726853330.66087: done getting variables 13273 1726853330.66152: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:28:50 -0400 (0:00:00.113) 0:00:48.551 ****** 13273 1726853330.66193: entering _queue_task() for managed_node3/fail 13273 1726853330.66975: worker is 1 (out of 1 available) 13273 1726853330.66987: exiting _queue_task() for 
managed_node3/fail 13273 1726853330.67000: done queuing things up, now waiting for results queue to drain 13273 1726853330.67001: waiting for pending results... 13273 1726853330.67693: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13273 1726853330.67853: in run() - task 02083763-bbaf-5fc3-657d-00000000016e 13273 1726853330.67879: variable 'ansible_search_path' from source: unknown 13273 1726853330.67893: variable 'ansible_search_path' from source: unknown 13273 1726853330.68040: calling self._execute() 13273 1726853330.68235: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853330.68253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853330.68335: variable 'omit' from source: magic vars 13273 1726853330.69199: variable 'ansible_distribution_major_version' from source: facts 13273 1726853330.69220: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853330.69477: variable 'network_state' from source: role '' defaults 13273 1726853330.69481: Evaluated conditional (network_state != {}): False 13273 1726853330.69484: when evaluation is False, skipping this task 13273 1726853330.69487: _execute() done 13273 1726853330.69490: dumping result to json 13273 1726853330.69527: done dumping result, returning 13273 1726853330.69676: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-5fc3-657d-00000000016e] 13273 1726853330.69681: sending task result for task 02083763-bbaf-5fc3-657d-00000000016e skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853330.69820: no more 
pending results, returning what we have 13273 1726853330.69827: results queue empty 13273 1726853330.69828: checking for any_errors_fatal 13273 1726853330.69835: done checking for any_errors_fatal 13273 1726853330.69836: checking for max_fail_percentage 13273 1726853330.69838: done checking for max_fail_percentage 13273 1726853330.69839: checking to see if all hosts have failed and the running result is not ok 13273 1726853330.69840: done checking to see if all hosts have failed 13273 1726853330.69841: getting the remaining hosts for this loop 13273 1726853330.69842: done getting the remaining hosts for this loop 13273 1726853330.69848: getting the next task for host managed_node3 13273 1726853330.69857: done getting next task for host managed_node3 13273 1726853330.69862: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853330.69867: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853330.69898: getting variables 13273 1726853330.69900: in VariableManager get_vars() 13273 1726853330.69963: Calling all_inventory to load vars for managed_node3 13273 1726853330.69967: Calling groups_inventory to load vars for managed_node3 13273 1726853330.69970: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853330.70285: Calling all_plugins_play to load vars for managed_node3 13273 1726853330.70289: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853330.70292: Calling groups_plugins_play to load vars for managed_node3 13273 1726853330.71178: done sending task result for task 02083763-bbaf-5fc3-657d-00000000016e 13273 1726853330.71182: WORKER PROCESS EXITING 13273 1726853330.73302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853330.76491: done with get_vars() 13273 1726853330.76526: done getting variables 13273 1726853330.76799: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:28:50 -0400 (0:00:00.106) 0:00:48.657 ****** 13273 1726853330.76838: entering _queue_task() for managed_node3/fail 13273 1726853330.77643: worker is 1 (out of 1 available) 13273 1726853330.77659: exiting _queue_task() for managed_node3/fail 13273 1726853330.77669: done queuing things up, now waiting for results queue to drain 13273 1726853330.77670: waiting for pending results... 
13273 1726853330.78156: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13273 1726853330.78467: in run() - task 02083763-bbaf-5fc3-657d-00000000016f 13273 1726853330.78529: variable 'ansible_search_path' from source: unknown 13273 1726853330.78877: variable 'ansible_search_path' from source: unknown 13273 1726853330.78881: calling self._execute() 13273 1726853330.78950: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853330.78965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853330.78985: variable 'omit' from source: magic vars 13273 1726853330.79820: variable 'ansible_distribution_major_version' from source: facts 13273 1726853330.79840: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853330.80090: variable 'network_state' from source: role '' defaults 13273 1726853330.80108: Evaluated conditional (network_state != {}): False 13273 1726853330.80181: when evaluation is False, skipping this task 13273 1726853330.80195: _execute() done 13273 1726853330.80204: dumping result to json 13273 1726853330.80212: done dumping result, returning 13273 1726853330.80225: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-5fc3-657d-00000000016f] 13273 1726853330.80237: sending task result for task 02083763-bbaf-5fc3-657d-00000000016f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853330.80403: no more pending results, returning what we have 13273 1726853330.80408: results queue empty 13273 1726853330.80409: checking for any_errors_fatal 13273 1726853330.80419: done checking for any_errors_fatal 
13273 1726853330.80420: checking for max_fail_percentage 13273 1726853330.80422: done checking for max_fail_percentage 13273 1726853330.80423: checking to see if all hosts have failed and the running result is not ok 13273 1726853330.80423: done checking to see if all hosts have failed 13273 1726853330.80424: getting the remaining hosts for this loop 13273 1726853330.80426: done getting the remaining hosts for this loop 13273 1726853330.80429: getting the next task for host managed_node3 13273 1726853330.80438: done getting next task for host managed_node3 13273 1726853330.80443: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853330.80451: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853330.80482: getting variables 13273 1726853330.80484: in VariableManager get_vars() 13273 1726853330.80541: Calling all_inventory to load vars for managed_node3 13273 1726853330.80544: Calling groups_inventory to load vars for managed_node3 13273 1726853330.80550: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853330.80562: Calling all_plugins_play to load vars for managed_node3 13273 1726853330.80565: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853330.80569: Calling groups_plugins_play to load vars for managed_node3 13273 1726853330.81877: done sending task result for task 02083763-bbaf-5fc3-657d-00000000016f 13273 1726853330.81880: WORKER PROCESS EXITING 13273 1726853330.83605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853330.86781: done with get_vars() 13273 1726853330.86813: done getting variables 13273 1726853330.87081: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:28:50 -0400 (0:00:00.102) 0:00:48.760 ****** 13273 1726853330.87118: entering _queue_task() for managed_node3/fail 13273 1726853330.87701: worker is 1 (out of 1 available) 13273 1726853330.87717: exiting _queue_task() for managed_node3/fail 13273 1726853330.87730: done queuing things up, now waiting for results queue to drain 13273 1726853330.87731: waiting for pending results... 
13273 1726853330.88614: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13273 1726853330.88921: in run() - task 02083763-bbaf-5fc3-657d-000000000170 13273 1726853330.88940: variable 'ansible_search_path' from source: unknown 13273 1726853330.89378: variable 'ansible_search_path' from source: unknown 13273 1726853330.89383: calling self._execute() 13273 1726853330.89387: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853330.89390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853330.89394: variable 'omit' from source: magic vars 13273 1726853330.89922: variable 'ansible_distribution_major_version' from source: facts 13273 1726853330.89945: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853330.90121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853330.98280: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853330.98349: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853330.98676: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853330.98680: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853330.98683: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853330.98740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853330.98774: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853330.98803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853330.99075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853330.99078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853330.99122: variable 'ansible_distribution_major_version' from source: facts 13273 1726853330.99141: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13273 1726853330.99249: variable 'ansible_distribution' from source: facts 13273 1726853330.99677: variable '__network_rh_distros' from source: role '' defaults 13273 1726853330.99681: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13273 1726853330.99913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853330.99943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853330.99975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 
1726853331.00026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.00046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.00104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.00131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.00160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.00209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.00230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.00276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.00307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13273 1726853331.00341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.00386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.00406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.00724: variable 'network_connections' from source: task vars 13273 1726853331.00740: variable 'controller_profile' from source: play vars 13273 1726853331.00813: variable 'controller_profile' from source: play vars 13273 1726853331.00830: variable 'network_state' from source: role '' defaults 13273 1726853331.00902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853331.01092: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853331.01132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853331.01166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853331.01206: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853331.01252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853331.01280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853331.01323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.01353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853331.01382: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13273 1726853331.01391: when evaluation is False, skipping this task 13273 1726853331.01398: _execute() done 13273 1726853331.01407: dumping result to json 13273 1726853331.01419: done dumping result, returning 13273 1726853331.01433: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-5fc3-657d-000000000170] 13273 1726853331.01441: sending task result for task 02083763-bbaf-5fc3-657d-000000000170 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13273 1726853331.01645: no more pending results, returning what we have 13273 1726853331.01648: results queue empty 13273 1726853331.01649: checking for any_errors_fatal 13273 1726853331.01655: done checking for 
any_errors_fatal 13273 1726853331.01656: checking for max_fail_percentage 13273 1726853331.01658: done checking for max_fail_percentage 13273 1726853331.01659: checking to see if all hosts have failed and the running result is not ok 13273 1726853331.01660: done checking to see if all hosts have failed 13273 1726853331.01660: getting the remaining hosts for this loop 13273 1726853331.01662: done getting the remaining hosts for this loop 13273 1726853331.01666: getting the next task for host managed_node3 13273 1726853331.01674: done getting next task for host managed_node3 13273 1726853331.01679: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853331.01683: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853331.01706: getting variables 13273 1726853331.01707: in VariableManager get_vars() 13273 1726853331.01755: Calling all_inventory to load vars for managed_node3 13273 1726853331.01758: Calling groups_inventory to load vars for managed_node3 13273 1726853331.01761: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853331.01770: Calling all_plugins_play to load vars for managed_node3 13273 1726853331.01780: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853331.01784: Calling groups_plugins_play to load vars for managed_node3 13273 1726853331.02395: done sending task result for task 02083763-bbaf-5fc3-657d-000000000170 13273 1726853331.02399: WORKER PROCESS EXITING 13273 1726853331.09611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853331.11181: done with get_vars() 13273 1726853331.11207: done getting variables 13273 1726853331.11261: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:28:51 -0400 (0:00:00.241) 0:00:49.002 ****** 13273 1726853331.11293: entering _queue_task() for managed_node3/dnf 13273 1726853331.11661: worker is 1 (out of 1 available) 13273 1726853331.11777: exiting _queue_task() for managed_node3/dnf 13273 1726853331.11789: done queuing things up, now waiting for results queue to drain 13273 1726853331.11790: waiting for pending results... 
13273 1726853331.11995: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13273 1726853331.12164: in run() - task 02083763-bbaf-5fc3-657d-000000000171 13273 1726853331.12187: variable 'ansible_search_path' from source: unknown 13273 1726853331.12195: variable 'ansible_search_path' from source: unknown 13273 1726853331.12244: calling self._execute() 13273 1726853331.12362: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853331.12377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853331.12391: variable 'omit' from source: magic vars 13273 1726853331.12788: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.12805: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853331.13077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853331.15287: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853331.15363: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853331.15414: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853331.15458: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853331.15493: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853331.15575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.15616: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.15678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.15693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.15713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.16077: variable 'ansible_distribution' from source: facts 13273 1726853331.16081: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.16084: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13273 1726853331.16086: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853331.16113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.16142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.16170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.16223: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.16243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.16288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.16323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.16350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.16393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.16417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.16463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.16494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 
1726853331.16528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.16638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.16642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.16856: variable 'network_connections' from source: task vars 13273 1726853331.16863: variable 'controller_profile' from source: play vars 13273 1726853331.16865: variable 'controller_profile' from source: play vars 13273 1726853331.16917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853331.17096: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853331.17132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853331.17161: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853331.17578: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853331.17583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853331.17586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853331.17599: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.17876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853331.17880: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853331.18234: variable 'network_connections' from source: task vars 13273 1726853331.18477: variable 'controller_profile' from source: play vars 13273 1726853331.18480: variable 'controller_profile' from source: play vars 13273 1726853331.18483: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853331.18485: when evaluation is False, skipping this task 13273 1726853331.18487: _execute() done 13273 1726853331.18490: dumping result to json 13273 1726853331.18492: done dumping result, returning 13273 1726853331.18585: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000171] 13273 1726853331.18694: sending task result for task 02083763-bbaf-5fc3-657d-000000000171 13273 1726853331.18769: done sending task result for task 02083763-bbaf-5fc3-657d-000000000171 13273 1726853331.18775: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853331.18851: no more pending results, returning what we have 13273 1726853331.18855: results queue empty 13273 1726853331.18857: checking for any_errors_fatal 13273 1726853331.18867: done checking for 
any_errors_fatal 13273 1726853331.18868: checking for max_fail_percentage 13273 1726853331.18870: done checking for max_fail_percentage 13273 1726853331.18872: checking to see if all hosts have failed and the running result is not ok 13273 1726853331.18873: done checking to see if all hosts have failed 13273 1726853331.18874: getting the remaining hosts for this loop 13273 1726853331.18876: done getting the remaining hosts for this loop 13273 1726853331.18879: getting the next task for host managed_node3 13273 1726853331.18887: done getting next task for host managed_node3 13273 1726853331.18891: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13273 1726853331.18895: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853331.18920: getting variables 13273 1726853331.18922: in VariableManager get_vars() 13273 1726853331.19283: Calling all_inventory to load vars for managed_node3 13273 1726853331.19286: Calling groups_inventory to load vars for managed_node3 13273 1726853331.19289: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853331.19298: Calling all_plugins_play to load vars for managed_node3 13273 1726853331.19302: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853331.19305: Calling groups_plugins_play to load vars for managed_node3 13273 1726853331.20994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853331.22863: done with get_vars() 13273 1726853331.22890: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13273 1726853331.22964: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:28:51 -0400 (0:00:00.117) 0:00:49.119 ****** 13273 1726853331.23003: entering _queue_task() for managed_node3/yum 13273 1726853331.23403: worker is 1 (out of 1 available) 13273 1726853331.23420: exiting _queue_task() for managed_node3/yum 13273 1726853331.23433: done queuing things up, now waiting for results queue to drain 13273 1726853331.23434: waiting for pending results... 
13273 1726853331.23673: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13273 1726853331.23842: in run() - task 02083763-bbaf-5fc3-657d-000000000172 13273 1726853331.23868: variable 'ansible_search_path' from source: unknown 13273 1726853331.23881: variable 'ansible_search_path' from source: unknown 13273 1726853331.23925: calling self._execute() 13273 1726853331.24037: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853331.24080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853331.24083: variable 'omit' from source: magic vars 13273 1726853331.24444: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.24461: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853331.24637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853331.26905: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853331.26930: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853331.26974: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853331.27023: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853331.27121: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853331.27141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.27178: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.27209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.27262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.27284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.27449: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.27453: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13273 1726853331.27455: when evaluation is False, skipping this task 13273 1726853331.27457: _execute() done 13273 1726853331.27460: dumping result to json 13273 1726853331.27462: done dumping result, returning 13273 1726853331.27464: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000172] 13273 1726853331.27467: sending task result for task 02083763-bbaf-5fc3-657d-000000000172 13273 1726853331.27628: done sending task result for task 02083763-bbaf-5fc3-657d-000000000172 13273 1726853331.27631: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13273 1726853331.27712: no more pending results, returning 
what we have 13273 1726853331.27716: results queue empty 13273 1726853331.27717: checking for any_errors_fatal 13273 1726853331.27722: done checking for any_errors_fatal 13273 1726853331.27723: checking for max_fail_percentage 13273 1726853331.27725: done checking for max_fail_percentage 13273 1726853331.27726: checking to see if all hosts have failed and the running result is not ok 13273 1726853331.27726: done checking to see if all hosts have failed 13273 1726853331.27727: getting the remaining hosts for this loop 13273 1726853331.27728: done getting the remaining hosts for this loop 13273 1726853331.27732: getting the next task for host managed_node3 13273 1726853331.27740: done getting next task for host managed_node3 13273 1726853331.27744: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13273 1726853331.27748: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853331.27773: getting variables 13273 1726853331.27774: in VariableManager get_vars() 13273 1726853331.27829: Calling all_inventory to load vars for managed_node3 13273 1726853331.27832: Calling groups_inventory to load vars for managed_node3 13273 1726853331.27834: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853331.27844: Calling all_plugins_play to load vars for managed_node3 13273 1726853331.27847: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853331.27850: Calling groups_plugins_play to load vars for managed_node3 13273 1726853331.30943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853331.33953: done with get_vars() 13273 1726853331.34184: done getting variables 13273 1726853331.34245: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:28:51 -0400 (0:00:00.112) 0:00:49.231 ****** 13273 1726853331.34283: entering _queue_task() for managed_node3/fail 13273 1726853331.34830: worker is 1 (out of 1 available) 13273 1726853331.34843: exiting _queue_task() for managed_node3/fail 13273 1726853331.34857: done queuing things up, now waiting for results queue to drain 13273 1726853331.34858: waiting for pending results... 
13273 1726853331.35296: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13273 1726853331.35331: in run() - task 02083763-bbaf-5fc3-657d-000000000173 13273 1726853331.35352: variable 'ansible_search_path' from source: unknown 13273 1726853331.35360: variable 'ansible_search_path' from source: unknown 13273 1726853331.35409: calling self._execute() 13273 1726853331.35528: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853331.35541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853331.35556: variable 'omit' from source: magic vars 13273 1726853331.35968: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.35987: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853331.36112: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853331.36377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853331.38581: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853331.38652: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853331.38699: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853331.38737: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853331.38767: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853331.38851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853331.38891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.38921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.38960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.38979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.39112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.39115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.39118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.39120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.39139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.39184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.39216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.39248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.39292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.39309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.39485: variable 'network_connections' from source: task vars 13273 1726853331.39546: variable 'controller_profile' from source: play vars 13273 1726853331.39579: variable 'controller_profile' from source: play vars 13273 1726853331.39661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853331.39820: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853331.40218: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853331.40252: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 
1726853331.40286: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853331.40389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853331.40392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853331.40406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.40443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853331.40525: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853331.40754: variable 'network_connections' from source: task vars 13273 1726853331.40766: variable 'controller_profile' from source: play vars 13273 1726853331.40830: variable 'controller_profile' from source: play vars 13273 1726853331.40957: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853331.40961: when evaluation is False, skipping this task 13273 1726853331.40963: _execute() done 13273 1726853331.40966: dumping result to json 13273 1726853331.40968: done dumping result, returning 13273 1726853331.40970: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000173] 13273 1726853331.40975: sending task result for task 02083763-bbaf-5fc3-657d-000000000173 13273 1726853331.41053: 
done sending task result for task 02083763-bbaf-5fc3-657d-000000000173 13273 1726853331.41056: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853331.41111: no more pending results, returning what we have 13273 1726853331.41114: results queue empty 13273 1726853331.41115: checking for any_errors_fatal 13273 1726853331.41122: done checking for any_errors_fatal 13273 1726853331.41122: checking for max_fail_percentage 13273 1726853331.41124: done checking for max_fail_percentage 13273 1726853331.41125: checking to see if all hosts have failed and the running result is not ok 13273 1726853331.41125: done checking to see if all hosts have failed 13273 1726853331.41126: getting the remaining hosts for this loop 13273 1726853331.41127: done getting the remaining hosts for this loop 13273 1726853331.41130: getting the next task for host managed_node3 13273 1726853331.41137: done getting next task for host managed_node3 13273 1726853331.41141: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13273 1726853331.41144: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853331.41375: getting variables 13273 1726853331.41377: in VariableManager get_vars() 13273 1726853331.41426: Calling all_inventory to load vars for managed_node3 13273 1726853331.41429: Calling groups_inventory to load vars for managed_node3 13273 1726853331.41432: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853331.41441: Calling all_plugins_play to load vars for managed_node3 13273 1726853331.41444: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853331.41447: Calling groups_plugins_play to load vars for managed_node3 13273 1726853331.44224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853331.45938: done with get_vars() 13273 1726853331.45964: done getting variables 13273 1726853331.46031: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:28:51 -0400 (0:00:00.117) 0:00:49.349 ****** 13273 1726853331.46074: entering _queue_task() for managed_node3/package 13273 1726853331.47001: worker is 1 (out of 1 available) 13273 1726853331.47014: exiting _queue_task() for managed_node3/package 13273 1726853331.47027: done queuing things up, now waiting for results queue to drain 13273 1726853331.47028: waiting for pending results... 
13273 1726853331.47692: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 13273 1726853331.47698: in run() - task 02083763-bbaf-5fc3-657d-000000000174 13273 1726853331.47701: variable 'ansible_search_path' from source: unknown 13273 1726853331.47703: variable 'ansible_search_path' from source: unknown 13273 1726853331.47705: calling self._execute() 13273 1726853331.47816: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853331.47829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853331.47841: variable 'omit' from source: magic vars 13273 1726853331.48215: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.48232: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853331.48424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853331.48712: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853331.48766: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853331.48806: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853331.48884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853331.49001: variable 'network_packages' from source: role '' defaults 13273 1726853331.49178: variable '__network_provider_setup' from source: role '' defaults 13273 1726853331.49181: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853331.49200: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853331.49214: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853331.49286: variable 
'__network_packages_default_nm' from source: role '' defaults 13273 1726853331.49473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853331.53807: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853331.53938: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853331.54084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853331.54121: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853331.54272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853331.54369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.54579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.54583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.54688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.54714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 
1726853331.54769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.54941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.55048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.55052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.55054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.55366: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853331.55676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.55738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.55772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.55868: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.55952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.56136: variable 'ansible_python' from source: facts 13273 1726853331.56287: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853331.56579: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853331.56582: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853331.56888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.56923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.56954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.57059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.57083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.57174: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.57257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.57368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.57412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.57474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.57737: variable 'network_connections' from source: task vars 13273 1726853331.57994: variable 'controller_profile' from source: play vars 13273 1726853331.58103: variable 'controller_profile' from source: play vars 13273 1726853331.58162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853331.58248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853331.58285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.58548: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853331.58552: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853331.59134: variable 'network_connections' from source: task vars 13273 1726853331.59149: variable 'controller_profile' from source: play vars 13273 1726853331.59350: variable 'controller_profile' from source: play vars 13273 1726853331.59390: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853331.59606: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853331.60330: variable 'network_connections' from source: task vars 13273 1726853331.60341: variable 'controller_profile' from source: play vars 13273 1726853331.60506: variable 'controller_profile' from source: play vars 13273 1726853331.60530: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853331.60835: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853331.61418: variable 'network_connections' from source: task vars 13273 1726853331.61428: variable 'controller_profile' from source: play vars 13273 1726853331.61589: variable 'controller_profile' from source: play vars 13273 1726853331.61651: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853331.61765: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853331.61819: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853331.61962: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853331.62412: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853331.63483: variable 'network_connections' from source: task vars 13273 
1726853331.63578: variable 'controller_profile' from source: play vars 13273 1726853331.63621: variable 'controller_profile' from source: play vars 13273 1726853331.63666: variable 'ansible_distribution' from source: facts 13273 1726853331.63678: variable '__network_rh_distros' from source: role '' defaults 13273 1726853331.63980: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.63983: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853331.64074: variable 'ansible_distribution' from source: facts 13273 1726853331.64094: variable '__network_rh_distros' from source: role '' defaults 13273 1726853331.64276: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.64280: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853331.64475: variable 'ansible_distribution' from source: facts 13273 1726853331.64530: variable '__network_rh_distros' from source: role '' defaults 13273 1726853331.64540: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.64670: variable 'network_provider' from source: set_fact 13273 1726853331.64693: variable 'ansible_facts' from source: unknown 13273 1726853331.66211: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13273 1726853331.66265: when evaluation is False, skipping this task 13273 1726853331.66275: _execute() done 13273 1726853331.66283: dumping result to json 13273 1726853331.66290: done dumping result, returning 13273 1726853331.66479: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-5fc3-657d-000000000174] 13273 1726853331.66483: sending task result for task 02083763-bbaf-5fc3-657d-000000000174 13273 1726853331.66566: done sending task result for task 02083763-bbaf-5fc3-657d-000000000174 13273 1726853331.66569: WORKER PROCESS 
EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13273 1726853331.66637: no more pending results, returning what we have 13273 1726853331.66641: results queue empty 13273 1726853331.66643: checking for any_errors_fatal 13273 1726853331.66653: done checking for any_errors_fatal 13273 1726853331.66654: checking for max_fail_percentage 13273 1726853331.66656: done checking for max_fail_percentage 13273 1726853331.66657: checking to see if all hosts have failed and the running result is not ok 13273 1726853331.66658: done checking to see if all hosts have failed 13273 1726853331.66658: getting the remaining hosts for this loop 13273 1726853331.66660: done getting the remaining hosts for this loop 13273 1726853331.66664: getting the next task for host managed_node3 13273 1726853331.66674: done getting next task for host managed_node3 13273 1726853331.66679: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853331.66683: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 13273 1726853331.66997: getting variables 13273 1726853331.66999: in VariableManager get_vars() 13273 1726853331.67057: Calling all_inventory to load vars for managed_node3 13273 1726853331.67061: Calling groups_inventory to load vars for managed_node3 13273 1726853331.67063: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853331.67083: Calling all_plugins_play to load vars for managed_node3 13273 1726853331.67087: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853331.67091: Calling groups_plugins_play to load vars for managed_node3 13273 1726853331.70107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853331.73526: done with get_vars() 13273 1726853331.73563: done getting variables 13273 1726853331.73626: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:28:51 -0400 (0:00:00.277) 0:00:49.626 ****** 13273 1726853331.73781: entering _queue_task() for managed_node3/package 13273 1726853331.74554: worker is 1 (out of 1 available) 13273 1726853331.74568: exiting _queue_task() for managed_node3/package 13273 1726853331.74583: done queuing things up, now waiting for results queue to drain 13273 1726853331.74584: waiting for pending results... 
13273 1726853331.75124: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13273 1726853331.75356: in run() - task 02083763-bbaf-5fc3-657d-000000000175 13273 1726853331.75631: variable 'ansible_search_path' from source: unknown 13273 1726853331.75635: variable 'ansible_search_path' from source: unknown 13273 1726853331.75638: calling self._execute() 13273 1726853331.75695: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853331.75749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853331.75956: variable 'omit' from source: magic vars 13273 1726853331.76593: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.76725: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853331.76967: variable 'network_state' from source: role '' defaults 13273 1726853331.76986: Evaluated conditional (network_state != {}): False 13273 1726853331.76995: when evaluation is False, skipping this task 13273 1726853331.77002: _execute() done 13273 1726853331.77009: dumping result to json 13273 1726853331.77016: done dumping result, returning 13273 1726853331.77029: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-5fc3-657d-000000000175] 13273 1726853331.77044: sending task result for task 02083763-bbaf-5fc3-657d-000000000175 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853331.77206: no more pending results, returning what we have 13273 1726853331.77210: results queue empty 13273 1726853331.77212: checking for any_errors_fatal 13273 1726853331.77220: done checking for any_errors_fatal 13273 1726853331.77221: checking for max_fail_percentage 13273 
1726853331.77223: done checking for max_fail_percentage 13273 1726853331.77224: checking to see if all hosts have failed and the running result is not ok 13273 1726853331.77224: done checking to see if all hosts have failed 13273 1726853331.77225: getting the remaining hosts for this loop 13273 1726853331.77226: done getting the remaining hosts for this loop 13273 1726853331.77230: getting the next task for host managed_node3 13273 1726853331.77236: done getting next task for host managed_node3 13273 1726853331.77240: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853331.77244: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853331.77276: getting variables 13273 1726853331.77278: in VariableManager get_vars() 13273 1726853331.77335: Calling all_inventory to load vars for managed_node3 13273 1726853331.77338: Calling groups_inventory to load vars for managed_node3 13273 1726853331.77341: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853331.77356: Calling all_plugins_play to load vars for managed_node3 13273 1726853331.77359: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853331.77362: Calling groups_plugins_play to load vars for managed_node3 13273 1726853331.78078: done sending task result for task 02083763-bbaf-5fc3-657d-000000000175 13273 1726853331.78081: WORKER PROCESS EXITING 13273 1726853331.80631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853331.84029: done with get_vars() 13273 1726853331.84177: done getting variables 13273 1726853331.84236: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:28:51 -0400 (0:00:00.104) 0:00:49.733 ****** 13273 1726853331.84397: entering _queue_task() for managed_node3/package 13273 1726853331.85189: worker is 1 (out of 1 available) 13273 1726853331.85196: exiting _queue_task() for managed_node3/package 13273 1726853331.85205: done queuing things up, now waiting for results queue to drain 13273 1726853331.85206: waiting for pending results... 
13273 1726853331.85790: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13273 1726853331.85795: in run() - task 02083763-bbaf-5fc3-657d-000000000176 13273 1726853331.85798: variable 'ansible_search_path' from source: unknown 13273 1726853331.85800: variable 'ansible_search_path' from source: unknown 13273 1726853331.85997: calling self._execute() 13273 1726853331.86105: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853331.86119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853331.86376: variable 'omit' from source: magic vars 13273 1726853331.86864: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.87176: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853331.87208: variable 'network_state' from source: role '' defaults 13273 1726853331.87224: Evaluated conditional (network_state != {}): False 13273 1726853331.87231: when evaluation is False, skipping this task 13273 1726853331.87237: _execute() done 13273 1726853331.87244: dumping result to json 13273 1726853331.87250: done dumping result, returning 13273 1726853331.87263: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-5fc3-657d-000000000176] 13273 1726853331.87275: sending task result for task 02083763-bbaf-5fc3-657d-000000000176 13273 1726853331.87649: done sending task result for task 02083763-bbaf-5fc3-657d-000000000176 13273 1726853331.87653: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853331.87704: no more pending results, returning what we have 13273 1726853331.87708: results queue empty 13273 1726853331.87709: checking for 
any_errors_fatal 13273 1726853331.87717: done checking for any_errors_fatal 13273 1726853331.87717: checking for max_fail_percentage 13273 1726853331.87719: done checking for max_fail_percentage 13273 1726853331.87720: checking to see if all hosts have failed and the running result is not ok 13273 1726853331.87721: done checking to see if all hosts have failed 13273 1726853331.87721: getting the remaining hosts for this loop 13273 1726853331.87723: done getting the remaining hosts for this loop 13273 1726853331.87726: getting the next task for host managed_node3 13273 1726853331.87733: done getting next task for host managed_node3 13273 1726853331.87737: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853331.87742: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853331.87769: getting variables 13273 1726853331.87770: in VariableManager get_vars() 13273 1726853331.87818: Calling all_inventory to load vars for managed_node3 13273 1726853331.87821: Calling groups_inventory to load vars for managed_node3 13273 1726853331.87824: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853331.87832: Calling all_plugins_play to load vars for managed_node3 13273 1726853331.87835: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853331.87837: Calling groups_plugins_play to load vars for managed_node3 13273 1726853331.90705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853331.94121: done with get_vars() 13273 1726853331.94153: done getting variables 13273 1726853331.94280: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:28:51 -0400 (0:00:00.100) 0:00:49.833 ****** 13273 1726853331.94438: entering _queue_task() for managed_node3/service 13273 1726853331.95160: worker is 1 (out of 1 available) 13273 1726853331.95175: exiting _queue_task() for managed_node3/service 13273 1726853331.95189: done queuing things up, now waiting for results queue to drain 13273 1726853331.95190: waiting for pending results... 
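The next task, announced in the header above, is driven by the 'service' action module and guarded by the wireless/team condition that the log evaluates below. A plausible sketch, assuming the obvious service parameters (the task name, module, and condition come from the log; the rest is an assumption):

```yaml
# Hypothetical reconstruction; not the role's verbatim source.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed from the task name
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since neither wireless nor team connections are defined in this run's `network_connections`, the condition is False and the task is skipped.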
13273 1726853331.95541: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13273 1726853331.95687: in run() - task 02083763-bbaf-5fc3-657d-000000000177 13273 1726853331.95703: variable 'ansible_search_path' from source: unknown 13273 1726853331.95706: variable 'ansible_search_path' from source: unknown 13273 1726853331.95745: calling self._execute() 13273 1726853331.95854: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853331.95861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853331.95872: variable 'omit' from source: magic vars 13273 1726853331.96248: variable 'ansible_distribution_major_version' from source: facts 13273 1726853331.96268: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853331.96391: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853331.96586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853331.99397: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853331.99476: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853331.99513: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853331.99554: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853331.99586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853331.99707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13273 1726853331.99711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.99764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.99767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.99784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853331.99832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853331.99859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853331.99890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853331.99977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853331.99984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.00037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853332.00040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.00043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.00093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.00096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.00273: variable 'network_connections' from source: task vars 13273 1726853332.00286: variable 'controller_profile' from source: play vars 13273 1726853332.00369: variable 'controller_profile' from source: play vars 13273 1726853332.00470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853332.00759: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853332.00763: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853332.00765: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 
1726853332.00767: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853332.00783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853332.00805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853332.00837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.00857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853332.00896: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853332.01210: variable 'network_connections' from source: task vars 13273 1726853332.01213: variable 'controller_profile' from source: play vars 13273 1726853332.01215: variable 'controller_profile' from source: play vars 13273 1726853332.01217: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13273 1726853332.01220: when evaluation is False, skipping this task 13273 1726853332.01222: _execute() done 13273 1726853332.01224: dumping result to json 13273 1726853332.01226: done dumping result, returning 13273 1726853332.01228: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5fc3-657d-000000000177] 13273 1726853332.01230: sending task result for task 02083763-bbaf-5fc3-657d-000000000177 13273 1726853332.01325: done sending task 
result for task 02083763-bbaf-5fc3-657d-000000000177 13273 1726853332.01332: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13273 1726853332.01409: no more pending results, returning what we have 13273 1726853332.01412: results queue empty 13273 1726853332.01413: checking for any_errors_fatal 13273 1726853332.01419: done checking for any_errors_fatal 13273 1726853332.01420: checking for max_fail_percentage 13273 1726853332.01423: done checking for max_fail_percentage 13273 1726853332.01424: checking to see if all hosts have failed and the running result is not ok 13273 1726853332.01424: done checking to see if all hosts have failed 13273 1726853332.01425: getting the remaining hosts for this loop 13273 1726853332.01427: done getting the remaining hosts for this loop 13273 1726853332.01430: getting the next task for host managed_node3 13273 1726853332.01437: done getting next task for host managed_node3 13273 1726853332.01440: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853332.01444: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853332.01465: getting variables 13273 1726853332.01467: in VariableManager get_vars() 13273 1726853332.01687: Calling all_inventory to load vars for managed_node3 13273 1726853332.01690: Calling groups_inventory to load vars for managed_node3 13273 1726853332.01693: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853332.01706: Calling all_plugins_play to load vars for managed_node3 13273 1726853332.01710: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853332.01713: Calling groups_plugins_play to load vars for managed_node3 13273 1726853332.03921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853332.05828: done with get_vars() 13273 1726853332.05859: done getting variables 13273 1726853332.05923: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:28:52 -0400 (0:00:00.115) 0:00:49.948 ****** 13273 1726853332.05967: entering _queue_task() for managed_node3/service 13273 1726853332.06341: worker is 1 (out of 1 available) 13273 1726853332.06355: exiting _queue_task() for managed_node3/service 13273 1726853332.06480: done queuing things up, now waiting for results queue to drain 13273 1726853332.06481: waiting for pending results... 
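Unlike the two skipped tasks before it, the task announced above actually runs: the log shows its conditional (network_provider == "nm" or network_state != {}) evaluating True and `network_service_name` being resolved from role defaults before the 'service' action executes over SSH. A hedged sketch of what such a task would look like (task name, module, variable name, and condition are from the log; the state/enabled values are assumptions):

```yaml
# Hypothetical reconstruction; not the role's verbatim source.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"  # resolved from role defaults per the log
    state: started                      # assumed
    enabled: true                       # assumed
  when: network_provider == "nm" or network_state != {}
```

Because this task is not skipped, the log that follows switches from conditional evaluation to connection mechanics: loading the ssh connection plugin, setting connection vars, and issuing the first `_low_level_execute_command()` probes on the managed host.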
13273 1726853332.06680: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13273 1726853332.06878: in run() - task 02083763-bbaf-5fc3-657d-000000000178 13273 1726853332.06881: variable 'ansible_search_path' from source: unknown 13273 1726853332.06884: variable 'ansible_search_path' from source: unknown 13273 1726853332.06887: calling self._execute() 13273 1726853332.07003: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853332.07009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853332.07025: variable 'omit' from source: magic vars 13273 1726853332.07776: variable 'ansible_distribution_major_version' from source: facts 13273 1726853332.07780: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853332.08135: variable 'network_provider' from source: set_fact 13273 1726853332.08139: variable 'network_state' from source: role '' defaults 13273 1726853332.08153: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13273 1726853332.08159: variable 'omit' from source: magic vars 13273 1726853332.08260: variable 'omit' from source: magic vars 13273 1726853332.08289: variable 'network_service_name' from source: role '' defaults 13273 1726853332.08469: variable 'network_service_name' from source: role '' defaults 13273 1726853332.08782: variable '__network_provider_setup' from source: role '' defaults 13273 1726853332.08788: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853332.08900: variable '__network_service_name_default_nm' from source: role '' defaults 13273 1726853332.08910: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853332.09084: variable '__network_packages_default_nm' from source: role '' defaults 13273 1726853332.09559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13273 1726853332.12383: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853332.12463: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853332.12505: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853332.12575: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853332.12579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853332.12655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853332.12688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.12710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.12755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.12775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.12839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13273 1726853332.12842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.12858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.12900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.12914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.13178: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13273 1726853332.13287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853332.13376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.13380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.13384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.13393: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.13486: variable 'ansible_python' from source: facts 13273 1726853332.13514: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13273 1726853332.13723: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853332.13807: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853332.13938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853332.13976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.14006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.14049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.14275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.14279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853332.14289: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.14291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.14293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.14295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.14395: variable 'network_connections' from source: task vars 13273 1726853332.14407: variable 'controller_profile' from source: play vars 13273 1726853332.14485: variable 'controller_profile' from source: play vars 13273 1726853332.14601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853332.14757: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853332.14802: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853332.14837: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853332.14876: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853332.14918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853332.14942: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853332.14966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.14990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853332.15029: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853332.15209: variable 'network_connections' from source: task vars 13273 1726853332.15215: variable 'controller_profile' from source: play vars 13273 1726853332.15269: variable 'controller_profile' from source: play vars 13273 1726853332.15295: variable '__network_packages_default_wireless' from source: role '' defaults 13273 1726853332.15348: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853332.15532: variable 'network_connections' from source: task vars 13273 1726853332.15535: variable 'controller_profile' from source: play vars 13273 1726853332.15589: variable 'controller_profile' from source: play vars 13273 1726853332.15605: variable '__network_packages_default_team' from source: role '' defaults 13273 1726853332.15681: variable '__network_team_connections_defined' from source: role '' defaults 13273 1726853332.15858: variable 'network_connections' from source: task vars 13273 1726853332.15861: variable 'controller_profile' from source: play vars 13273 1726853332.16080: variable 'controller_profile' from source: play vars 13273 1726853332.16083: variable '__network_service_name_default_initscripts' from source: role '' defaults 13273 1726853332.16086: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 13273 1726853332.16088: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853332.16090: variable '__network_packages_default_initscripts' from source: role '' defaults 13273 1726853332.16246: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13273 1726853332.16712: variable 'network_connections' from source: task vars 13273 1726853332.16715: variable 'controller_profile' from source: play vars 13273 1726853332.16792: variable 'controller_profile' from source: play vars 13273 1726853332.16796: variable 'ansible_distribution' from source: facts 13273 1726853332.16798: variable '__network_rh_distros' from source: role '' defaults 13273 1726853332.16806: variable 'ansible_distribution_major_version' from source: facts 13273 1726853332.16817: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13273 1726853332.17091: variable 'ansible_distribution' from source: facts 13273 1726853332.17095: variable '__network_rh_distros' from source: role '' defaults 13273 1726853332.17097: variable 'ansible_distribution_major_version' from source: facts 13273 1726853332.17099: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13273 1726853332.17259: variable 'ansible_distribution' from source: facts 13273 1726853332.17262: variable '__network_rh_distros' from source: role '' defaults 13273 1726853332.17265: variable 'ansible_distribution_major_version' from source: facts 13273 1726853332.17306: variable 'network_provider' from source: set_fact 13273 1726853332.17329: variable 'omit' from source: magic vars 13273 1726853332.17420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853332.17425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853332.17427: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853332.17473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853332.17477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853332.17499: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853332.17502: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853332.17504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853332.17649: Set connection var ansible_connection to ssh 13273 1726853332.17652: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853332.17654: Set connection var ansible_shell_executable to /bin/sh 13273 1726853332.17656: Set connection var ansible_shell_type to sh 13273 1726853332.17658: Set connection var ansible_pipelining to False 13273 1726853332.17660: Set connection var ansible_timeout to 10 13273 1726853332.17741: variable 'ansible_shell_executable' from source: unknown 13273 1726853332.17745: variable 'ansible_connection' from source: unknown 13273 1726853332.17747: variable 'ansible_module_compression' from source: unknown 13273 1726853332.17750: variable 'ansible_shell_type' from source: unknown 13273 1726853332.17754: variable 'ansible_shell_executable' from source: unknown 13273 1726853332.17757: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853332.17759: variable 'ansible_pipelining' from source: unknown 13273 1726853332.17761: variable 'ansible_timeout' from source: unknown 13273 1726853332.17763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853332.17851: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853332.17859: variable 'omit' from source: magic vars 13273 1726853332.17862: starting attempt loop 13273 1726853332.17864: running the handler 13273 1726853332.17906: variable 'ansible_facts' from source: unknown 13273 1726853332.18740: _low_level_execute_command(): starting 13273 1726853332.18752: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853332.19697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853332.19731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853332.19734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853332.19745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853332.19783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853332.19786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853332.19790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853332.19792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853332.19795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853332.19846: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853332.19857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853332.19860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853332.19940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853332.21863: stdout chunk (state=3): >>>/root <<< 13273 1726853332.21910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853332.21913: stdout chunk (state=3): >>><<< 13273 1726853332.22079: stderr chunk (state=3): >>><<< 13273 1726853332.22085: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853332.22088: _low_level_execute_command(): starting 
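The exchange above is Ansible's first `_low_level_execute_command()` round trip: the connection plugin runs `/bin/sh -c 'echo ~ && sleep 0'` over the multiplexed SSH session to discover the remote account's home directory (`/root` here), collecting rc, stdout, and stderr separately. A minimal local sketch of that pattern — `low_level_exec` is a hypothetical helper name, not Ansible's actual API, and it runs the command locally where Ansible would wrap the same string in an `ssh` invocation:

```python
import subprocess

def low_level_exec(cmd: str):
    """Run a command string through /bin/sh, returning (rc, stdout, stderr),
    roughly the shape of Ansible's _low_level_execute_command().
    Ansible hands the same string to the ssh connection plugin instead."""
    proc = subprocess.run(
        ["/bin/sh", "-c", cmd],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# The trailing 'sleep 0' mirrors the log: it gives the shell a final
# external command so the pipeline always yields a clean exit status.
rc, out, err = low_level_exec("echo ~ && sleep 0")
print(rc, out.strip())  # rc=0 plus the home directory (/root in the log)
```

The home directory learned here is what later steps expand `~/.ansible/tmp` against when building the remote temporary directory.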
13273 1726853332.22091: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835 `" && echo ansible-tmp-1726853332.22038-15572-133028197927835="` echo /root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835 `" ) && sleep 0' 13273 1726853332.22696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853332.22786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853332.22802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853332.22815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853332.22832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853332.22918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853332.24963: stdout chunk (state=3): 
>>>ansible-tmp-1726853332.22038-15572-133028197927835=/root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835 <<< 13273 1726853332.25420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853332.25425: stdout chunk (state=3): >>><<< 13273 1726853332.25427: stderr chunk (state=3): >>><<< 13273 1726853332.25430: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853332.22038-15572-133028197927835=/root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853332.25432: variable 'ansible_module_compression' from source: unknown 13273 1726853332.25436: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13273 1726853332.25485: variable 'ansible_facts' 
from source: unknown 13273 1726853332.25763: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/AnsiballZ_systemd.py 13273 1726853332.26025: Sending initial data 13273 1726853332.26029: Sent initial data (154 bytes) 13273 1726853332.27043: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853332.27293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853332.27594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853332.27670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853332.29309: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13273 1726853332.29323: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853332.29416: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853332.29462: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp6p35snpp /root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/AnsiballZ_systemd.py <<< 13273 1726853332.29498: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/AnsiballZ_systemd.py" <<< 13273 1726853332.29556: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp6p35snpp" to remote "/root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/AnsiballZ_systemd.py" <<< 13273 1726853332.31627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853332.31765: stderr chunk (state=3): >>><<< 13273 1726853332.31769: stdout chunk (state=3): >>><<< 13273 1726853332.31773: done transferring module to remote 13273 1726853332.31776: _low_level_execute_command(): starting 13273 1726853332.31778: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/ /root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/AnsiballZ_systemd.py && sleep 0' 
13273 1726853332.32577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853332.32584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853332.32608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853332.32620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853332.32636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853332.32651: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853332.32667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853332.32682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853332.32691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853332.32697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853332.32706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853332.32745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853332.32751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853332.32753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853332.32756: stderr chunk (state=3): >>>debug2: match found <<< 13273 1726853332.32758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853332.32853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853332.32857: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 13273 1726853332.32907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853332.32982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853332.34979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853332.34983: stdout chunk (state=3): >>><<< 13273 1726853332.34986: stderr chunk (state=3): >>><<< 13273 1726853332.35032: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853332.35101: _low_level_execute_command(): starting 13273 1726853332.35107: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/AnsiballZ_systemd.py && sleep 0' 13273 1726853332.35782: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853332.35798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853332.35821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853332.35841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853332.35887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853332.35956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853332.35978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853332.36010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853332.36127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853332.65856: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 
30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": 
"10596352", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3332538368", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1331219000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 13273 1726853332.65881: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": 
"infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": 
"no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket 
system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 
13273 1726853332.67977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853332.67982: stdout chunk (state=3): >>><<< 13273 1726853332.67985: stderr chunk (state=3): >>><<< 13273 1726853332.67987: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 
; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10596352", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3332538368", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1331219000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", 
"MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", 
"CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", 
"RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 
13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853332.68126: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853332.68155: _low_level_execute_command(): starting 13273 1726853332.68165: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853332.22038-15572-133028197927835/ > /dev/null 2>&1 && sleep 0' 13273 1726853332.68810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853332.68825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853332.68840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853332.68885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853332.68905: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853332.68987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853332.69007: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853332.69028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853332.69123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853332.71092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853332.71096: stdout chunk (state=3): >>><<< 13273 1726853332.71098: stderr chunk (state=3): >>><<< 13273 1726853332.71177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853332.71181: handler run complete 13273 1726853332.71223: attempt loop complete, returning result 13273 1726853332.71240: _execute() done 13273 1726853332.71251: dumping result to json 13273 1726853332.71545: done dumping result, returning 13273 1726853332.71550: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-5fc3-657d-000000000178] 13273 1726853332.71553: sending task result for task 02083763-bbaf-5fc3-657d-000000000178 13273 1726853332.71711: done sending task result for task 02083763-bbaf-5fc3-657d-000000000178 13273 1726853332.71714: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853332.71781: no more pending results, returning what we have 13273 1726853332.71785: results queue empty 13273 1726853332.71786: checking for any_errors_fatal 13273 1726853332.71791: done checking for any_errors_fatal 13273 1726853332.71792: checking for max_fail_percentage 13273 1726853332.71794: done checking for max_fail_percentage 13273 1726853332.71795: checking to see if all hosts have failed and the running result is not ok 13273 1726853332.71796: done checking to see if all hosts have failed 13273 1726853332.71796: getting the remaining hosts for this loop 13273 1726853332.71798: done getting the remaining hosts for this loop 13273 1726853332.71801: getting the next task for host managed_node3 13273 1726853332.71809: done getting next task for host managed_node3 13273 1726853332.71813: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853332.71817: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853332.71834: getting variables 13273 1726853332.71836: in VariableManager get_vars() 13273 1726853332.72091: Calling all_inventory to load vars for managed_node3 13273 1726853332.72094: Calling groups_inventory to load vars for managed_node3 13273 1726853332.72097: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853332.72112: Calling all_plugins_play to load vars for managed_node3 13273 1726853332.72116: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853332.72119: Calling groups_plugins_play to load vars for managed_node3 13273 1726853332.75698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853332.78131: done with get_vars() 13273 1726853332.78166: done getting variables 13273 1726853332.78339: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** 
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:28:52 -0400 (0:00:00.725) 0:00:50.674 ****** 13273 1726853332.78548: entering _queue_task() for managed_node3/service 13273 1726853332.79296: worker is 1 (out of 1 available) 13273 1726853332.79306: exiting _queue_task() for managed_node3/service 13273 1726853332.79318: done queuing things up, now waiting for results queue to drain 13273 1726853332.79319: waiting for pending results... 13273 1726853332.80103: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13273 1726853332.80108: in run() - task 02083763-bbaf-5fc3-657d-000000000179 13273 1726853332.80112: variable 'ansible_search_path' from source: unknown 13273 1726853332.80115: variable 'ansible_search_path' from source: unknown 13273 1726853332.80210: calling self._execute() 13273 1726853332.80313: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853332.80321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853332.80330: variable 'omit' from source: magic vars 13273 1726853332.80886: variable 'ansible_distribution_major_version' from source: facts 13273 1726853332.80905: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853332.81036: variable 'network_provider' from source: set_fact 13273 1726853332.81051: Evaluated conditional (network_provider == "nm"): True 13273 1726853332.81159: variable '__network_wpa_supplicant_required' from source: role '' defaults 13273 1726853332.81268: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13273 1726853332.81509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853332.83804: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
13273 1726853332.84015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853332.84029: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853332.84232: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853332.84236: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853332.84405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853332.84441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.84677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.84681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.84684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.84742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853332.84840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.84877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.84988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.85111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.85189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853332.85257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853332.85299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.85374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853332.85395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853332.85811: variable 'network_connections' from source: task vars 
13273 1726853332.85852: variable 'controller_profile' from source: play vars 13273 1726853332.86104: variable 'controller_profile' from source: play vars 13273 1726853332.86199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13273 1726853332.86589: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13273 1726853332.86826: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13273 1726853332.87298: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13273 1726853332.87301: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13273 1726853332.87304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13273 1726853332.87306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13273 1726853332.87309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853332.87313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13273 1726853332.87519: variable '__network_wireless_connections_defined' from source: role '' defaults 13273 1726853332.87991: variable 'network_connections' from source: task vars 13273 1726853332.88002: variable 'controller_profile' from source: play vars 13273 1726853332.88261: variable 'controller_profile' from source: play vars 13273 
1726853332.88298: Evaluated conditional (__network_wpa_supplicant_required): False 13273 1726853332.88351: when evaluation is False, skipping this task 13273 1726853332.88358: _execute() done 13273 1726853332.88365: dumping result to json 13273 1726853332.88374: done dumping result, returning 13273 1726853332.88387: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-5fc3-657d-000000000179] 13273 1726853332.88477: sending task result for task 02083763-bbaf-5fc3-657d-000000000179 13273 1726853332.88682: done sending task result for task 02083763-bbaf-5fc3-657d-000000000179 13273 1726853332.88685: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13273 1726853332.88856: no more pending results, returning what we have 13273 1726853332.88859: results queue empty 13273 1726853332.88860: checking for any_errors_fatal 13273 1726853332.88934: done checking for any_errors_fatal 13273 1726853332.88936: checking for max_fail_percentage 13273 1726853332.88938: done checking for max_fail_percentage 13273 1726853332.88939: checking to see if all hosts have failed and the running result is not ok 13273 1726853332.88940: done checking to see if all hosts have failed 13273 1726853332.88940: getting the remaining hosts for this loop 13273 1726853332.88942: done getting the remaining hosts for this loop 13273 1726853332.88949: getting the next task for host managed_node3 13273 1726853332.88959: done getting next task for host managed_node3 13273 1726853332.88964: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853332.88968: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853332.89095: getting variables 13273 1726853332.89097: in VariableManager get_vars() 13273 1726853332.89152: Calling all_inventory to load vars for managed_node3 13273 1726853332.89156: Calling groups_inventory to load vars for managed_node3 13273 1726853332.89158: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853332.89168: Calling all_plugins_play to load vars for managed_node3 13273 1726853332.89577: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853332.89583: Calling groups_plugins_play to load vars for managed_node3 13273 1726853332.93914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853332.97748: done with get_vars() 13273 1726853332.97789: done getting variables 13273 1726853332.97969: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 
2024 13:28:52 -0400 (0:00:00.194) 0:00:50.869 ****** 13273 1726853332.98006: entering _queue_task() for managed_node3/service 13273 1726853332.98826: worker is 1 (out of 1 available) 13273 1726853332.98837: exiting _queue_task() for managed_node3/service 13273 1726853332.98856: done queuing things up, now waiting for results queue to drain 13273 1726853332.98857: waiting for pending results... 13273 1726853332.99790: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13273 1726853333.00080: in run() - task 02083763-bbaf-5fc3-657d-00000000017a 13273 1726853333.00083: variable 'ansible_search_path' from source: unknown 13273 1726853333.00086: variable 'ansible_search_path' from source: unknown 13273 1726853333.00088: calling self._execute() 13273 1726853333.00269: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853333.00678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853333.00682: variable 'omit' from source: magic vars 13273 1726853333.01389: variable 'ansible_distribution_major_version' from source: facts 13273 1726853333.01400: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853333.01586: variable 'network_provider' from source: set_fact 13273 1726853333.01589: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853333.01592: when evaluation is False, skipping this task 13273 1726853333.01595: _execute() done 13273 1726853333.01599: dumping result to json 13273 1726853333.01602: done dumping result, returning 13273 1726853333.01611: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-5fc3-657d-00000000017a] 13273 1726853333.01616: sending task result for task 02083763-bbaf-5fc3-657d-00000000017a 13273 1726853333.01825: done sending task result for task 02083763-bbaf-5fc3-657d-00000000017a 13273 
1726853333.01827: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13273 1726853333.02111: no more pending results, returning what we have 13273 1726853333.02114: results queue empty 13273 1726853333.02115: checking for any_errors_fatal 13273 1726853333.02120: done checking for any_errors_fatal 13273 1726853333.02121: checking for max_fail_percentage 13273 1726853333.02123: done checking for max_fail_percentage 13273 1726853333.02124: checking to see if all hosts have failed and the running result is not ok 13273 1726853333.02124: done checking to see if all hosts have failed 13273 1726853333.02125: getting the remaining hosts for this loop 13273 1726853333.02126: done getting the remaining hosts for this loop 13273 1726853333.02129: getting the next task for host managed_node3 13273 1726853333.02136: done getting next task for host managed_node3 13273 1726853333.02140: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853333.02145: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 13273 1726853333.02168: getting variables 13273 1726853333.02169: in VariableManager get_vars() 13273 1726853333.02220: Calling all_inventory to load vars for managed_node3 13273 1726853333.02225: Calling groups_inventory to load vars for managed_node3 13273 1726853333.02227: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853333.02237: Calling all_plugins_play to load vars for managed_node3 13273 1726853333.02240: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853333.02243: Calling groups_plugins_play to load vars for managed_node3 13273 1726853333.06121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853333.09549: done with get_vars() 13273 1726853333.09585: done getting variables 13273 1726853333.09649: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:28:53 -0400 (0:00:00.118) 0:00:50.988 ****** 13273 1726853333.09890: entering _queue_task() for managed_node3/copy 13273 1726853333.10901: worker is 1 (out of 1 available) 13273 1726853333.10911: exiting _queue_task() for managed_node3/copy 13273 1726853333.10921: done queuing things up, now waiting for results queue to drain 13273 1726853333.10922: waiting for pending results... 
13273 1726853333.11089: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13273 1726853333.11539: in run() - task 02083763-bbaf-5fc3-657d-00000000017b 13273 1726853333.11543: variable 'ansible_search_path' from source: unknown 13273 1726853333.11545: variable 'ansible_search_path' from source: unknown 13273 1726853333.11547: calling self._execute() 13273 1726853333.11795: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853333.11807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853333.11816: variable 'omit' from source: magic vars 13273 1726853333.12676: variable 'ansible_distribution_major_version' from source: facts 13273 1726853333.12680: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853333.12954: variable 'network_provider' from source: set_fact 13273 1726853333.12957: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853333.12960: when evaluation is False, skipping this task 13273 1726853333.12962: _execute() done 13273 1726853333.12964: dumping result to json 13273 1726853333.12977: done dumping result, returning 13273 1726853333.12988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-5fc3-657d-00000000017b] 13273 1726853333.12994: sending task result for task 02083763-bbaf-5fc3-657d-00000000017b 13273 1726853333.13104: done sending task result for task 02083763-bbaf-5fc3-657d-00000000017b 13273 1726853333.13106: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853333.13169: no more pending results, returning what we have 13273 1726853333.13180: results queue empty 13273 1726853333.13182: checking for 
any_errors_fatal 13273 1726853333.13188: done checking for any_errors_fatal 13273 1726853333.13188: checking for max_fail_percentage 13273 1726853333.13190: done checking for max_fail_percentage 13273 1726853333.13191: checking to see if all hosts have failed and the running result is not ok 13273 1726853333.13192: done checking to see if all hosts have failed 13273 1726853333.13193: getting the remaining hosts for this loop 13273 1726853333.13194: done getting the remaining hosts for this loop 13273 1726853333.13198: getting the next task for host managed_node3 13273 1726853333.13205: done getting next task for host managed_node3 13273 1726853333.13209: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853333.13214: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853333.13241: getting variables 13273 1726853333.13243: in VariableManager get_vars() 13273 1726853333.13408: Calling all_inventory to load vars for managed_node3 13273 1726853333.13412: Calling groups_inventory to load vars for managed_node3 13273 1726853333.13414: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853333.13427: Calling all_plugins_play to load vars for managed_node3 13273 1726853333.13431: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853333.13434: Calling groups_plugins_play to load vars for managed_node3 13273 1726853333.16267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853333.19809: done with get_vars() 13273 1726853333.19833: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:28:53 -0400 (0:00:00.102) 0:00:51.090 ****** 13273 1726853333.20134: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853333.21104: worker is 1 (out of 1 available) 13273 1726853333.21115: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13273 1726853333.21127: done queuing things up, now waiting for results queue to drain 13273 1726853333.21128: waiting for pending results... 
13273 1726853333.21438: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13273 1726853333.21731: in run() - task 02083763-bbaf-5fc3-657d-00000000017c 13273 1726853333.21898: variable 'ansible_search_path' from source: unknown 13273 1726853333.21902: variable 'ansible_search_path' from source: unknown 13273 1726853333.21937: calling self._execute() 13273 1726853333.22154: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853333.22165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853333.22172: variable 'omit' from source: magic vars 13273 1726853333.23249: variable 'ansible_distribution_major_version' from source: facts 13273 1726853333.23253: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853333.23255: variable 'omit' from source: magic vars 13273 1726853333.23379: variable 'omit' from source: magic vars 13273 1726853333.23756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13273 1726853333.28583: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13273 1726853333.28656: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13273 1726853333.28747: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13273 1726853333.28785: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13273 1726853333.28811: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13273 1726853333.29096: variable 'network_provider' from source: set_fact 13273 1726853333.29401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13273 1726853333.29515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13273 1726853333.29562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13273 1726853333.29691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13273 1726853333.29713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13273 1726853333.29892: variable 'omit' from source: magic vars 13273 1726853333.30188: variable 'omit' from source: magic vars 13273 1726853333.30331: variable 'network_connections' from source: task vars 13273 1726853333.30341: variable 'controller_profile' from source: play vars 13273 1726853333.30522: variable 'controller_profile' from source: play vars 13273 1726853333.30859: variable 'omit' from source: magic vars 13273 1726853333.30868: variable '__lsr_ansible_managed' from source: task vars 13273 1726853333.31074: variable '__lsr_ansible_managed' from source: task vars 13273 1726853333.31444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13273 1726853333.31978: Loaded config def from plugin (lookup/template) 13273 1726853333.31982: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13273 1726853333.32057: File lookup term: get_ansible_managed.j2 13273 1726853333.32061: 
variable 'ansible_search_path' from source: unknown 13273 1726853333.32064: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13273 1726853333.32147: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13273 1726853333.32167: variable 'ansible_search_path' from source: unknown 13273 1726853333.47462: variable 'ansible_managed' from source: unknown 13273 1726853333.47686: variable 'omit' from source: magic vars 13273 1726853333.47717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853333.47742: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853333.47763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853333.47782: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853333.47792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853333.47824: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853333.47827: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853333.47830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853333.47981: Set connection var ansible_connection to ssh 13273 1726853333.47984: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853333.47986: Set connection var ansible_shell_executable to /bin/sh 13273 1726853333.47989: Set connection var ansible_shell_type to sh 13273 1726853333.47991: Set connection var ansible_pipelining to False 13273 1726853333.47993: Set connection var ansible_timeout to 10 13273 1726853333.47995: variable 'ansible_shell_executable' from source: unknown 13273 1726853333.47997: variable 'ansible_connection' from source: unknown 13273 1726853333.48000: variable 'ansible_module_compression' from source: unknown 13273 1726853333.48002: variable 'ansible_shell_type' from source: unknown 13273 1726853333.48003: variable 'ansible_shell_executable' from source: unknown 13273 1726853333.48006: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853333.48008: variable 'ansible_pipelining' from source: unknown 13273 1726853333.48010: variable 'ansible_timeout' from source: unknown 13273 1726853333.48011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853333.48137: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853333.48378: variable 'omit' from source: magic vars 13273 1726853333.48381: starting attempt loop 13273 1726853333.48384: running the handler 13273 1726853333.48386: _low_level_execute_command(): starting 13273 1726853333.48388: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853333.48921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853333.48933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853333.48944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853333.48986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853333.48993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853333.49116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853333.49120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853333.49259: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13273 1726853333.50972: stdout chunk (state=3): >>>/root <<< 13273 1726853333.51133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853333.51139: stdout chunk (state=3): >>><<< 13273 1726853333.51148: stderr chunk (state=3): >>><<< 13273 1726853333.51312: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853333.51325: _low_level_execute_command(): starting 13273 1726853333.51331: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058 `" && echo ansible-tmp-1726853333.5131168-15621-116954152020058="` echo /root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058 `" ) && sleep 0' 13273 
1726853333.52567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853333.52573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853333.52725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853333.52757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853333.52763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853333.52965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853333.53036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853333.55012: stdout chunk (state=3): >>>ansible-tmp-1726853333.5131168-15621-116954152020058=/root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058 <<< 13273 1726853333.55120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853333.55165: stderr chunk (state=3): >>><<< 13273 1726853333.55243: stdout chunk (state=3): >>><<< 13273 1726853333.55260: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853333.5131168-15621-116954152020058=/root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853333.55380: variable 'ansible_module_compression' from source: unknown 13273 1726853333.55477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13273 1726853333.55606: variable 'ansible_facts' from source: unknown 13273 1726853333.55911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/AnsiballZ_network_connections.py 13273 1726853333.56163: Sending initial data 13273 1726853333.56166: Sent initial data (168 bytes) 13273 1726853333.57829: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853333.57941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853333.57979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853333.57996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853333.58020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853333.58146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853333.59835: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853333.60267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853333.60272: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp6jzcuz9t /root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/AnsiballZ_network_connections.py <<< 13273 1726853333.60275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/AnsiballZ_network_connections.py" <<< 13273 1726853333.60348: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp6jzcuz9t" to remote "/root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/AnsiballZ_network_connections.py" <<< 13273 1726853333.61794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853333.61919: stderr chunk (state=3): >>><<< 13273 1726853333.61924: stdout chunk (state=3): >>><<< 13273 1726853333.62178: done transferring module to remote 13273 1726853333.62181: _low_level_execute_command(): starting 13273 1726853333.62184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/ /root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/AnsiballZ_network_connections.py && sleep 0' 13273 1726853333.62687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853333.62716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853333.62733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853333.62781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853333.62875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853333.64784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853333.64788: stdout chunk (state=3): >>><<< 13273 1726853333.64794: stderr chunk (state=3): >>><<< 13273 1726853333.64811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853333.64814: _low_level_execute_command(): starting 13273 1726853333.64819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/AnsiballZ_network_connections.py && sleep 0' 13273 1726853333.65428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853333.65435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853333.65445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853333.65463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853333.65508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853333.65512: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853333.65514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853333.65517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853333.65525: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853333.65529: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853333.65531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853333.65534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853333.65543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853333.65622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853333.65625: stderr chunk (state=3): >>>debug2: match found <<< 13273 1726853333.65627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853333.65639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853333.65653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853333.65677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853333.65768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853334.05444: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__rodny2d/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__rodny2d/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 13273 1726853334.05458: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: 
Connection volatilize aborted on bond0/3903334e-7358-4806-a114-5ea6dbf2cacf: error=unknown <<< 13273 1726853334.05649: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13273 1726853334.07691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853334.07779: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 13273 1726853334.07782: stderr chunk (state=3): >>><<< 13273 1726853334.07799: stdout chunk (state=3): >>><<< 13273 1726853334.07856: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__rodny2d/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__rodny2d/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/3903334e-7358-4806-a114-5ea6dbf2cacf: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": 
{"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853334.07881: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'down', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853334.07894: _low_level_execute_command(): starting 13273 1726853334.07903: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853333.5131168-15621-116954152020058/ > /dev/null 2>&1 && sleep 0' 13273 1726853334.08491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853334.08495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853334.08497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853334.08499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 13273 1726853334.08501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853334.08503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853334.08505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853334.08554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853334.08557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853334.08563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853334.08624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853334.10636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853334.10682: stderr chunk (state=3): >>><<< 13273 1726853334.10686: stdout chunk (state=3): >>><<< 13273 1726853334.10879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853334.10888: handler run complete 13273 1726853334.10891: attempt loop complete, returning result 13273 1726853334.10894: _execute() done 13273 1726853334.10896: dumping result to json 13273 1726853334.10899: done dumping result, returning 13273 1726853334.10901: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-5fc3-657d-00000000017c] 13273 1726853334.10903: sending task result for task 02083763-bbaf-5fc3-657d-00000000017c 13273 1726853334.10987: done sending task result for task 02083763-bbaf-5fc3-657d-00000000017c 13273 1726853334.10991: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13273 1726853334.11097: no more pending results, returning what we have 13273 1726853334.11100: results queue empty 13273 1726853334.11101: checking for any_errors_fatal 13273 1726853334.11106: done checking for any_errors_fatal 13273 1726853334.11107: checking for max_fail_percentage 13273 1726853334.11109: done checking for max_fail_percentage 13273 1726853334.11109: checking to see if all hosts have failed and the running result is not ok 13273 1726853334.11110: done checking to see if all hosts have failed 13273 1726853334.11111: getting the remaining 
hosts for this loop 13273 1726853334.11112: done getting the remaining hosts for this loop 13273 1726853334.11115: getting the next task for host managed_node3 13273 1726853334.11122: done getting next task for host managed_node3 13273 1726853334.11125: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853334.11129: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853334.11141: getting variables 13273 1726853334.11142: in VariableManager get_vars() 13273 1726853334.11380: Calling all_inventory to load vars for managed_node3 13273 1726853334.11383: Calling groups_inventory to load vars for managed_node3 13273 1726853334.11386: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853334.11394: Calling all_plugins_play to load vars for managed_node3 13273 1726853334.11397: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853334.11400: Calling groups_plugins_play to load vars for managed_node3 13273 1726853334.12785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853334.14493: done with get_vars() 13273 1726853334.14518: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:28:54 -0400 (0:00:00.944) 0:00:52.035 ****** 13273 1726853334.14610: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853334.15149: worker is 1 (out of 1 available) 13273 1726853334.15161: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13273 1726853334.15174: done queuing things up, now waiting for results queue to drain 13273 1726853334.15176: waiting for pending results... 
13273 1726853334.15493: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13273 1726853334.15589: in run() - task 02083763-bbaf-5fc3-657d-00000000017d 13273 1726853334.15608: variable 'ansible_search_path' from source: unknown 13273 1726853334.15614: variable 'ansible_search_path' from source: unknown 13273 1726853334.15655: calling self._execute() 13273 1726853334.15764: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.15780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.15797: variable 'omit' from source: magic vars 13273 1726853334.16181: variable 'ansible_distribution_major_version' from source: facts 13273 1726853334.16198: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853334.16326: variable 'network_state' from source: role '' defaults 13273 1726853334.16344: Evaluated conditional (network_state != {}): False 13273 1726853334.16359: when evaluation is False, skipping this task 13273 1726853334.16367: _execute() done 13273 1726853334.16463: dumping result to json 13273 1726853334.16466: done dumping result, returning 13273 1726853334.16470: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-5fc3-657d-00000000017d] 13273 1726853334.16473: sending task result for task 02083763-bbaf-5fc3-657d-00000000017d 13273 1726853334.16539: done sending task result for task 02083763-bbaf-5fc3-657d-00000000017d 13273 1726853334.16542: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13273 1726853334.16620: no more pending results, returning what we have 13273 1726853334.16625: results queue empty 13273 1726853334.16627: checking for any_errors_fatal 13273 1726853334.16636: done checking for any_errors_fatal 
13273 1726853334.16637: checking for max_fail_percentage 13273 1726853334.16639: done checking for max_fail_percentage 13273 1726853334.16640: checking to see if all hosts have failed and the running result is not ok 13273 1726853334.16640: done checking to see if all hosts have failed 13273 1726853334.16641: getting the remaining hosts for this loop 13273 1726853334.16642: done getting the remaining hosts for this loop 13273 1726853334.16649: getting the next task for host managed_node3 13273 1726853334.16657: done getting next task for host managed_node3 13273 1726853334.16661: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853334.16666: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853334.16693: getting variables 13273 1726853334.16694: in VariableManager get_vars() 13273 1726853334.16749: Calling all_inventory to load vars for managed_node3 13273 1726853334.16752: Calling groups_inventory to load vars for managed_node3 13273 1726853334.16756: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853334.16767: Calling all_plugins_play to load vars for managed_node3 13273 1726853334.16770: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853334.16978: Calling groups_plugins_play to load vars for managed_node3 13273 1726853334.19498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853334.22304: done with get_vars() 13273 1726853334.22335: done getting variables 13273 1726853334.22496: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:28:54 -0400 (0:00:00.079) 0:00:52.114 ****** 13273 1726853334.22578: entering _queue_task() for managed_node3/debug 13273 1726853334.23066: worker is 1 (out of 1 available) 13273 1726853334.23082: exiting _queue_task() for managed_node3/debug 13273 1726853334.23094: done queuing things up, now waiting for results queue to drain 13273 1726853334.23096: waiting for pending results... 
13273 1726853334.23490: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13273 1726853334.23549: in run() - task 02083763-bbaf-5fc3-657d-00000000017e 13273 1726853334.23574: variable 'ansible_search_path' from source: unknown 13273 1726853334.23588: variable 'ansible_search_path' from source: unknown 13273 1726853334.23629: calling self._execute() 13273 1726853334.23741: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.23758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.23775: variable 'omit' from source: magic vars 13273 1726853334.24201: variable 'ansible_distribution_major_version' from source: facts 13273 1726853334.24219: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853334.24234: variable 'omit' from source: magic vars 13273 1726853334.24304: variable 'omit' from source: magic vars 13273 1726853334.24345: variable 'omit' from source: magic vars 13273 1726853334.24393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853334.24433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853334.24460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853334.24485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853334.24502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853334.24536: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853334.24545: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.24553: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13273 1726853334.24655: Set connection var ansible_connection to ssh 13273 1726853334.24673: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853334.24876: Set connection var ansible_shell_executable to /bin/sh 13273 1726853334.24879: Set connection var ansible_shell_type to sh 13273 1726853334.24881: Set connection var ansible_pipelining to False 13273 1726853334.24883: Set connection var ansible_timeout to 10 13273 1726853334.24885: variable 'ansible_shell_executable' from source: unknown 13273 1726853334.24887: variable 'ansible_connection' from source: unknown 13273 1726853334.24889: variable 'ansible_module_compression' from source: unknown 13273 1726853334.24891: variable 'ansible_shell_type' from source: unknown 13273 1726853334.24892: variable 'ansible_shell_executable' from source: unknown 13273 1726853334.24894: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.24896: variable 'ansible_pipelining' from source: unknown 13273 1726853334.24897: variable 'ansible_timeout' from source: unknown 13273 1726853334.24899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.24902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853334.24912: variable 'omit' from source: magic vars 13273 1726853334.24924: starting attempt loop 13273 1726853334.24931: running the handler 13273 1726853334.25066: variable '__network_connections_result' from source: set_fact 13273 1726853334.25125: handler run complete 13273 1726853334.25151: attempt loop complete, returning result 13273 1726853334.25348: _execute() done 13273 1726853334.25353: dumping result to json 13273 1726853334.25357: 
done dumping result, returning 13273 1726853334.25363: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-5fc3-657d-00000000017e] 13273 1726853334.25365: sending task result for task 02083763-bbaf-5fc3-657d-00000000017e ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 13273 1726853334.25541: no more pending results, returning what we have 13273 1726853334.25549: results queue empty 13273 1726853334.25550: checking for any_errors_fatal 13273 1726853334.25559: done checking for any_errors_fatal 13273 1726853334.25560: checking for max_fail_percentage 13273 1726853334.25562: done checking for max_fail_percentage 13273 1726853334.25563: checking to see if all hosts have failed and the running result is not ok 13273 1726853334.25564: done checking to see if all hosts have failed 13273 1726853334.25565: getting the remaining hosts for this loop 13273 1726853334.25566: done getting the remaining hosts for this loop 13273 1726853334.25569: getting the next task for host managed_node3 13273 1726853334.25582: done getting next task for host managed_node3 13273 1726853334.25586: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13273 1726853334.25590: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853334.25605: getting variables 13273 1726853334.25607: in VariableManager get_vars() 13273 1726853334.25672: Calling all_inventory to load vars for managed_node3 13273 1726853334.25957: Calling groups_inventory to load vars for managed_node3 13273 1726853334.25961: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853334.26063: done sending task result for task 02083763-bbaf-5fc3-657d-00000000017e 13273 1726853334.26066: WORKER PROCESS EXITING 13273 1726853334.26144: Calling all_plugins_play to load vars for managed_node3 13273 1726853334.26150: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853334.26154: Calling groups_plugins_play to load vars for managed_node3 13273 1726853334.29554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853334.32042: done with get_vars() 13273 1726853334.32070: done getting variables 13273 1726853334.32139: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:28:54 -0400 (0:00:00.097) 0:00:52.212 ****** 13273 1726853334.32314: entering _queue_task() for managed_node3/debug 13273 1726853334.33146: worker is 1 (out of 1 available) 13273 1726853334.33168: exiting _queue_task() for managed_node3/debug 13273 
1726853334.33198: done queuing things up, now waiting for results queue to drain 13273 1726853334.33199: waiting for pending results... 13273 1726853334.33377: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13273 1726853334.33512: in run() - task 02083763-bbaf-5fc3-657d-00000000017f 13273 1726853334.33523: variable 'ansible_search_path' from source: unknown 13273 1726853334.33528: variable 'ansible_search_path' from source: unknown 13273 1726853334.33559: calling self._execute() 13273 1726853334.33643: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.33652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.33660: variable 'omit' from source: magic vars 13273 1726853334.34041: variable 'ansible_distribution_major_version' from source: facts 13273 1726853334.34076: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853334.34079: variable 'omit' from source: magic vars 13273 1726853334.34104: variable 'omit' from source: magic vars 13273 1726853334.34131: variable 'omit' from source: magic vars 13273 1726853334.34167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853334.34211: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853334.34257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853334.34274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853334.34300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853334.34344: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 
1726853334.34351: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.34354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.34526: Set connection var ansible_connection to ssh 13273 1726853334.34531: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853334.34537: Set connection var ansible_shell_executable to /bin/sh 13273 1726853334.34540: Set connection var ansible_shell_type to sh 13273 1726853334.34543: Set connection var ansible_pipelining to False 13273 1726853334.34564: Set connection var ansible_timeout to 10 13273 1726853334.34606: variable 'ansible_shell_executable' from source: unknown 13273 1726853334.34609: variable 'ansible_connection' from source: unknown 13273 1726853334.34612: variable 'ansible_module_compression' from source: unknown 13273 1726853334.34614: variable 'ansible_shell_type' from source: unknown 13273 1726853334.34616: variable 'ansible_shell_executable' from source: unknown 13273 1726853334.34618: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.34620: variable 'ansible_pipelining' from source: unknown 13273 1726853334.34622: variable 'ansible_timeout' from source: unknown 13273 1726853334.34625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.34778: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853334.34787: variable 'omit' from source: magic vars 13273 1726853334.34790: starting attempt loop 13273 1726853334.34797: running the handler 13273 1726853334.34864: variable '__network_connections_result' from source: set_fact 13273 1726853334.34905: variable '__network_connections_result' from 
source: set_fact 13273 1726853334.35022: handler run complete 13273 1726853334.35026: attempt loop complete, returning result 13273 1726853334.35038: _execute() done 13273 1726853334.35041: dumping result to json 13273 1726853334.35043: done dumping result, returning 13273 1726853334.35074: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-5fc3-657d-00000000017f] 13273 1726853334.35080: sending task result for task 02083763-bbaf-5fc3-657d-00000000017f 13273 1726853334.35141: done sending task result for task 02083763-bbaf-5fc3-657d-00000000017f 13273 1726853334.35144: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13273 1726853334.35265: no more pending results, returning what we have 13273 1726853334.35277: results queue empty 13273 1726853334.35278: checking for any_errors_fatal 13273 1726853334.35284: done checking for any_errors_fatal 13273 1726853334.35285: checking for max_fail_percentage 13273 1726853334.35286: done checking for max_fail_percentage 13273 1726853334.35287: checking to see if all hosts have failed and the running result is not ok 13273 1726853334.35288: done checking to see if all hosts have failed 13273 1726853334.35288: getting the remaining hosts for this loop 13273 1726853334.35289: done getting the remaining hosts for this loop 13273 1726853334.35292: getting the next task for host managed_node3 13273 1726853334.35298: done getting next task for host managed_node3 13273 1726853334.35301: ^ task is: TASK: fedora.linux_system_roles.network : Show debug 
messages for the network_state 13273 1726853334.35305: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853334.35343: getting variables 13273 1726853334.35346: in VariableManager get_vars() 13273 1726853334.35397: Calling all_inventory to load vars for managed_node3 13273 1726853334.35400: Calling groups_inventory to load vars for managed_node3 13273 1726853334.35402: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853334.35409: Calling all_plugins_play to load vars for managed_node3 13273 1726853334.35412: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853334.35414: Calling groups_plugins_play to load vars for managed_node3 13273 1726853334.37336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853334.38618: done with get_vars() 13273 1726853334.38645: done getting variables 13273 1726853334.38710: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:28:54 -0400 (0:00:00.064) 0:00:52.276 ****** 13273 1726853334.38744: entering _queue_task() for managed_node3/debug 13273 1726853334.39089: worker is 1 (out of 1 available) 13273 1726853334.39100: exiting _queue_task() for managed_node3/debug 13273 1726853334.39224: done queuing things up, now waiting for results queue to drain 13273 1726853334.39226: waiting for pending results... 13273 1726853334.39451: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13273 1726853334.39595: in run() - task 02083763-bbaf-5fc3-657d-000000000180 13273 1726853334.39656: variable 'ansible_search_path' from source: unknown 13273 1726853334.39660: variable 'ansible_search_path' from source: unknown 13273 1726853334.39662: calling self._execute() 13273 1726853334.39807: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.39825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.39840: variable 'omit' from source: magic vars 13273 1726853334.40785: variable 'ansible_distribution_major_version' from source: facts 13273 1726853334.40788: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853334.40976: variable 'network_state' from source: role '' defaults 13273 1726853334.40980: Evaluated conditional (network_state != {}): False 13273 1726853334.40982: when evaluation is False, skipping this task 13273 1726853334.40985: _execute() done 13273 1726853334.40987: dumping result to json 13273 1726853334.40989: done 
dumping result, returning 13273 1726853334.40992: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-5fc3-657d-000000000180] 13273 1726853334.41221: sending task result for task 02083763-bbaf-5fc3-657d-000000000180 13273 1726853334.41301: done sending task result for task 02083763-bbaf-5fc3-657d-000000000180 13273 1726853334.41304: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 13273 1726853334.41374: no more pending results, returning what we have 13273 1726853334.41378: results queue empty 13273 1726853334.41379: checking for any_errors_fatal 13273 1726853334.41390: done checking for any_errors_fatal 13273 1726853334.41390: checking for max_fail_percentage 13273 1726853334.41393: done checking for max_fail_percentage 13273 1726853334.41394: checking to see if all hosts have failed and the running result is not ok 13273 1726853334.41395: done checking to see if all hosts have failed 13273 1726853334.41396: getting the remaining hosts for this loop 13273 1726853334.41398: done getting the remaining hosts for this loop 13273 1726853334.41401: getting the next task for host managed_node3 13273 1726853334.41409: done getting next task for host managed_node3 13273 1726853334.41420: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13273 1726853334.41427: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853334.41455: getting variables 13273 1726853334.41457: in VariableManager get_vars() 13273 1726853334.41513: Calling all_inventory to load vars for managed_node3 13273 1726853334.41517: Calling groups_inventory to load vars for managed_node3 13273 1726853334.41520: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853334.41756: Calling all_plugins_play to load vars for managed_node3 13273 1726853334.41760: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853334.41763: Calling groups_plugins_play to load vars for managed_node3 13273 1726853334.43596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853334.45409: done with get_vars() 13273 1726853334.45441: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:28:54 -0400 (0:00:00.068) 0:00:52.344 ****** 13273 1726853334.45555: entering _queue_task() for managed_node3/ping 13273 1726853334.45980: worker is 1 (out of 1 available) 13273 1726853334.45992: exiting _queue_task() for managed_node3/ping 13273 1726853334.46002: done queuing things up, now waiting for results queue to drain 13273 1726853334.46003: waiting for pending results... 
13273 1726853334.46376: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13273 1726853334.46440: in run() - task 02083763-bbaf-5fc3-657d-000000000181 13273 1726853334.46464: variable 'ansible_search_path' from source: unknown 13273 1726853334.46475: variable 'ansible_search_path' from source: unknown 13273 1726853334.46526: calling self._execute() 13273 1726853334.46776: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.46780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.46782: variable 'omit' from source: magic vars 13273 1726853334.47258: variable 'ansible_distribution_major_version' from source: facts 13273 1726853334.47278: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853334.47290: variable 'omit' from source: magic vars 13273 1726853334.47369: variable 'omit' from source: magic vars 13273 1726853334.47416: variable 'omit' from source: magic vars 13273 1726853334.47470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853334.47514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853334.47539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853334.47593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853334.47669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853334.47675: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853334.47678: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.47680: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13273 1726853334.47779: Set connection var ansible_connection to ssh 13273 1726853334.47797: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853334.47808: Set connection var ansible_shell_executable to /bin/sh 13273 1726853334.47815: Set connection var ansible_shell_type to sh 13273 1726853334.47827: Set connection var ansible_pipelining to False 13273 1726853334.47838: Set connection var ansible_timeout to 10 13273 1726853334.47877: variable 'ansible_shell_executable' from source: unknown 13273 1726853334.47976: variable 'ansible_connection' from source: unknown 13273 1726853334.47981: variable 'ansible_module_compression' from source: unknown 13273 1726853334.47983: variable 'ansible_shell_type' from source: unknown 13273 1726853334.47985: variable 'ansible_shell_executable' from source: unknown 13273 1726853334.47988: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853334.47991: variable 'ansible_pipelining' from source: unknown 13273 1726853334.47993: variable 'ansible_timeout' from source: unknown 13273 1726853334.47995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853334.48149: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13273 1726853334.48175: variable 'omit' from source: magic vars 13273 1726853334.48179: starting attempt loop 13273 1726853334.48224: running the handler 13273 1726853334.48227: _low_level_execute_command(): starting 13273 1726853334.48230: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853334.49219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853334.49257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853334.49281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853334.49334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853334.49496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853334.51221: stdout chunk (state=3): >>>/root <<< 13273 1726853334.51522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853334.51530: stdout chunk (state=3): >>><<< 13273 1726853334.51532: stderr chunk (state=3): >>><<< 13273 1726853334.51912: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853334.51915: _low_level_execute_command(): starting 13273 1726853334.51918: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788 `" && echo ansible-tmp-1726853334.5175576-15673-25419394028788="` echo /root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788 `" ) && sleep 0' 13273 1726853334.52554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853334.52605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853334.52626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853334.52744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853334.52807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853334.54816: stdout chunk (state=3): >>>ansible-tmp-1726853334.5175576-15673-25419394028788=/root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788 <<< 13273 1726853334.54929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853334.54960: stderr chunk (state=3): >>><<< 13273 1726853334.54968: stdout chunk (state=3): >>><<< 13273 1726853334.55065: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853334.5175576-15673-25419394028788=/root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853334.55069: variable 'ansible_module_compression' from source: unknown 13273 1726853334.55083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13273 1726853334.55119: variable 'ansible_facts' from source: unknown 13273 1726853334.55206: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/AnsiballZ_ping.py 13273 1726853334.55393: Sending initial data 13273 1726853334.55397: Sent initial data (152 bytes) 13273 1726853334.55967: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853334.55974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853334.55977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853334.55984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853334.56060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853334.56107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853334.56191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853334.58039: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853334.58043: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmp90596dai /root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/AnsiballZ_ping.py <<< 13273 1726853334.58048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/AnsiballZ_ping.py" <<< 13273 1726853334.58095: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmp90596dai" to remote "/root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/AnsiballZ_ping.py" <<< 13273 1726853334.59445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853334.59607: stderr chunk (state=3): >>><<< 13273 1726853334.59611: stdout chunk (state=3): >>><<< 13273 1726853334.59621: done transferring module to remote 13273 1726853334.59637: _low_level_execute_command(): starting 13273 1726853334.59649: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/ /root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/AnsiballZ_ping.py && sleep 0' 13273 1726853334.61050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853334.61129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853334.61311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853334.63499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853334.63502: stdout chunk (state=3): >>><<< 13273 1726853334.63509: stderr chunk (state=3): >>><<< 13273 1726853334.63526: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853334.63534: _low_level_execute_command(): starting 13273 1726853334.63537: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/AnsiballZ_ping.py && sleep 0' 13273 1726853334.64980: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853334.64983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853334.65016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853334.65019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853334.65144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853334.80877: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13273 1726853334.82084: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853334.82095: stdout chunk (state=3): >>><<< 13273 1726853334.82214: stderr chunk (state=3): >>><<< 13273 1726853334.82218: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853334.82221: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853334.82225: _low_level_execute_command(): starting 13273 1726853334.82227: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853334.5175576-15673-25419394028788/ > /dev/null 2>&1 && sleep 0' 13273 1726853334.83430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853334.83593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853334.83670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853334.83692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853334.83709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853334.83730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853334.83905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853334.85828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853334.85832: stdout chunk (state=3): >>><<< 13273 1726853334.85835: stderr chunk (state=3): >>><<< 13273 1726853334.86048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853334.86052: handler run complete 
13273 1726853334.86055: attempt loop complete, returning result 13273 1726853334.86057: _execute() done 13273 1726853334.86059: dumping result to json 13273 1726853334.86061: done dumping result, returning 13273 1726853334.86063: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-5fc3-657d-000000000181] 13273 1726853334.86065: sending task result for task 02083763-bbaf-5fc3-657d-000000000181 ok: [managed_node3] => { "changed": false, "ping": "pong" } 13273 1726853334.86243: no more pending results, returning what we have 13273 1726853334.86246: results queue empty 13273 1726853334.86247: checking for any_errors_fatal 13273 1726853334.86256: done checking for any_errors_fatal 13273 1726853334.86257: checking for max_fail_percentage 13273 1726853334.86277: done checking for max_fail_percentage 13273 1726853334.86278: checking to see if all hosts have failed and the running result is not ok 13273 1726853334.86279: done checking to see if all hosts have failed 13273 1726853334.86279: getting the remaining hosts for this loop 13273 1726853334.86281: done getting the remaining hosts for this loop 13273 1726853334.86284: getting the next task for host managed_node3 13273 1726853334.86295: done getting next task for host managed_node3 13273 1726853334.86298: ^ task is: TASK: meta (role_complete) 13273 1726853334.86302: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853334.86317: getting variables 13273 1726853334.86318: in VariableManager get_vars() 13273 1726853334.86778: Calling all_inventory to load vars for managed_node3 13273 1726853334.86782: Calling groups_inventory to load vars for managed_node3 13273 1726853334.86785: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853334.86794: Calling all_plugins_play to load vars for managed_node3 13273 1726853334.86799: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853334.86801: Calling groups_plugins_play to load vars for managed_node3 13273 1726853334.87387: done sending task result for task 02083763-bbaf-5fc3-657d-000000000181 13273 1726853334.87390: WORKER PROCESS EXITING 13273 1726853334.89424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853334.91185: done with get_vars() 13273 1726853334.91209: done getting variables 13273 1726853334.91402: done queuing things up, now waiting for results queue to drain 13273 1726853334.91405: results queue empty 13273 1726853334.91405: checking for any_errors_fatal 13273 1726853334.91408: done checking for any_errors_fatal 13273 1726853334.91409: checking for max_fail_percentage 13273 1726853334.91410: done checking for max_fail_percentage 13273 1726853334.91411: checking to see if all hosts have failed and the running result is not ok 13273 1726853334.91411: done checking to see if all hosts have failed 13273 1726853334.91412: getting the remaining hosts for this loop 13273 
1726853334.91413: done getting the remaining hosts for this loop 13273 1726853334.91416: getting the next task for host managed_node3 13273 1726853334.91420: done getting next task for host managed_node3 13273 1726853334.91424: ^ task is: TASK: Delete the device '{{ controller_device }}' 13273 1726853334.91426: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853334.91429: getting variables 13273 1726853334.91430: in VariableManager get_vars() 13273 1726853334.91450: Calling all_inventory to load vars for managed_node3 13273 1726853334.91453: Calling groups_inventory to load vars for managed_node3 13273 1726853334.91455: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853334.91460: Calling all_plugins_play to load vars for managed_node3 13273 1726853334.91462: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853334.91465: Calling groups_plugins_play to load vars for managed_node3 13273 1726853334.93309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853335.00186: done with get_vars() 13273 1726853335.00203: done getting variables 13273 1726853335.00235: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) 13273 1726853335.00306: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Friday 20 September 2024 13:28:55 -0400 (0:00:00.547) 0:00:52.892 ****** 13273 1726853335.00324: entering _queue_task() for managed_node3/command 13273 1726853335.00668: worker is 1 (out of 1 available) 13273 1726853335.00682: exiting _queue_task() for managed_node3/command 13273 1726853335.00695: done queuing things up, now waiting for results queue to drain 13273 1726853335.00697: waiting for pending results... 13273 1726853335.00910: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 13273 1726853335.01007: in run() - task 02083763-bbaf-5fc3-657d-0000000001b1 13273 1726853335.01029: variable 'ansible_search_path' from source: unknown 13273 1726853335.01062: calling self._execute() 13273 1726853335.01173: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853335.01178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.01187: variable 'omit' from source: magic vars 13273 1726853335.01599: variable 'ansible_distribution_major_version' from source: facts 13273 1726853335.01619: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853335.01629: variable 'omit' from source: magic vars 13273 1726853335.01645: variable 'omit' from source: magic vars 13273 1726853335.01775: variable 'controller_device' from source: play vars 13273 1726853335.01795: variable 'omit' from source: magic vars 13273 1726853335.01837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853335.01876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 13273 1726853335.01892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853335.01912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853335.01920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853335.01952: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853335.01955: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853335.01958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.02032: Set connection var ansible_connection to ssh 13273 1726853335.02056: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853335.02059: Set connection var ansible_shell_executable to /bin/sh 13273 1726853335.02062: Set connection var ansible_shell_type to sh 13273 1726853335.02065: Set connection var ansible_pipelining to False 13273 1726853335.02067: Set connection var ansible_timeout to 10 13273 1726853335.02147: variable 'ansible_shell_executable' from source: unknown 13273 1726853335.02154: variable 'ansible_connection' from source: unknown 13273 1726853335.02157: variable 'ansible_module_compression' from source: unknown 13273 1726853335.02160: variable 'ansible_shell_type' from source: unknown 13273 1726853335.02162: variable 'ansible_shell_executable' from source: unknown 13273 1726853335.02164: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853335.02167: variable 'ansible_pipelining' from source: unknown 13273 1726853335.02175: variable 'ansible_timeout' from source: unknown 13273 1726853335.02178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.02333: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853335.02340: variable 'omit' from source: magic vars 13273 1726853335.02343: starting attempt loop 13273 1726853335.02345: running the handler 13273 1726853335.02347: _low_level_execute_command(): starting 13273 1726853335.02351: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853335.03286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853335.03290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.03563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853335.03590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853335.03615: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853335.03618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.03629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853335.03638: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853335.03702: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853335.03706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853335.03708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.03726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853335.03729: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853335.03732: stderr chunk (state=3): >>>debug2: match found <<< 13273 1726853335.03734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.03823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853335.03826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.03829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.03910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.05635: stdout chunk (state=3): >>>/root <<< 13273 1726853335.05742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.05786: stderr chunk (state=3): >>><<< 13273 1726853335.05790: stdout chunk (state=3): >>><<< 13273 1726853335.05826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853335.05843: _low_level_execute_command(): starting 13273 1726853335.05852: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833 `" && echo ansible-tmp-1726853335.0581567-15710-89371006887833="` echo /root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833 `" ) && sleep 0' 13273 1726853335.06362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.06406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853335.06410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.06413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.06416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.06467: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.06473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.06540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.08500: stdout chunk (state=3): >>>ansible-tmp-1726853335.0581567-15710-89371006887833=/root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833 <<< 13273 1726853335.08608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.08636: stderr chunk (state=3): >>><<< 13273 1726853335.08640: stdout chunk (state=3): >>><<< 13273 1726853335.08656: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853335.0581567-15710-89371006887833=/root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853335.08683: variable 'ansible_module_compression' from source: unknown 13273 1726853335.08724: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853335.08760: variable 'ansible_facts' from source: unknown 13273 1726853335.08817: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/AnsiballZ_command.py 13273 1726853335.08915: Sending initial data 13273 1726853335.08919: Sent initial data (155 bytes) 13273 1726853335.09353: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.09356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853335.09358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853335.09361: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.09363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.09414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853335.09417: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.09483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.11118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853335.11121: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853335.11175: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853335.11254: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpx3a24tqz /root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/AnsiballZ_command.py <<< 13273 1726853335.11276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/AnsiballZ_command.py" <<< 13273 1726853335.11330: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13273 1726853335.11338: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpx3a24tqz" to remote "/root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/AnsiballZ_command.py" <<< 13273 1726853335.12051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.12111: stderr chunk (state=3): >>><<< 13273 1726853335.12114: stdout chunk (state=3): >>><<< 13273 1726853335.12122: done transferring module to remote 13273 1726853335.12132: _low_level_execute_command(): starting 13273 1726853335.12144: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/ /root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/AnsiballZ_command.py && sleep 0' 13273 1726853335.12697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.12705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.12708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.12720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.12732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.12801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.14666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.14688: stderr chunk (state=3): >>><<< 13273 1726853335.14692: stdout chunk (state=3): >>><<< 13273 1726853335.14703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853335.14706: _low_level_execute_command(): starting 13273 1726853335.14712: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/AnsiballZ_command.py && sleep 0' 13273 1726853335.15186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.15190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.15192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.15249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.15253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 13273 1726853335.15311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.31649: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:28:55.307394", "end": "2024-09-20 13:28:55.314959", "delta": "0:00:00.007565", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853335.33596: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. <<< 13273 1726853335.33600: stdout chunk (state=3): >>><<< 13273 1726853335.33603: stderr chunk (state=3): >>><<< 13273 1726853335.33606: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:28:55.307394", "end": "2024-09-20 13:28:55.314959", "delta": "0:00:00.007565", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. 13273 1726853335.33751: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853335.33755: _low_level_execute_command(): starting 13273 1726853335.33758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853335.0581567-15710-89371006887833/ > /dev/null 2>&1 && sleep 0' 13273 1726853335.34699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
13273 1726853335.34704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853335.34709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.34711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853335.34713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853335.34716: stderr chunk (state=3): >>>debug2: match not found <<< 13273 1726853335.34718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.34720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.34723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.34781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.36807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.36811: stdout chunk (state=3): >>><<< 13273 1726853335.36814: stderr chunk (state=3): >>><<< 13273 1726853335.36951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853335.36954: handler run complete 13273 1726853335.36984: Evaluated conditional (False): False 13273 1726853335.36994: Evaluated conditional (False): False 13273 1726853335.36998: attempt loop complete, returning result 13273 1726853335.37001: _execute() done 13273 1726853335.37003: dumping result to json 13273 1726853335.37005: done dumping result, returning 13273 1726853335.37014: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [02083763-bbaf-5fc3-657d-0000000001b1] 13273 1726853335.37017: sending task result for task 02083763-bbaf-5fc3-657d-0000000001b1 13273 1726853335.37323: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001b1 13273 1726853335.37325: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007565", "end": "2024-09-20 13:28:55.314959", "failed_when_result": false, "rc": 1, "start": "2024-09-20 
13:28:55.307394" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 13273 1726853335.37396: no more pending results, returning what we have 13273 1726853335.37400: results queue empty 13273 1726853335.37401: checking for any_errors_fatal 13273 1726853335.37402: done checking for any_errors_fatal 13273 1726853335.37403: checking for max_fail_percentage 13273 1726853335.37405: done checking for max_fail_percentage 13273 1726853335.37405: checking to see if all hosts have failed and the running result is not ok 13273 1726853335.37406: done checking to see if all hosts have failed 13273 1726853335.37407: getting the remaining hosts for this loop 13273 1726853335.37408: done getting the remaining hosts for this loop 13273 1726853335.37411: getting the next task for host managed_node3 13273 1726853335.37425: done getting next task for host managed_node3 13273 1726853335.37428: ^ task is: TASK: Remove test interfaces 13273 1726853335.37432: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853335.37440: getting variables 13273 1726853335.37442: in VariableManager get_vars() 13273 1726853335.37497: Calling all_inventory to load vars for managed_node3 13273 1726853335.37501: Calling groups_inventory to load vars for managed_node3 13273 1726853335.37504: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853335.37514: Calling all_plugins_play to load vars for managed_node3 13273 1726853335.37517: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853335.37520: Calling groups_plugins_play to load vars for managed_node3 13273 1726853335.40894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853335.45466: done with get_vars() 13273 1726853335.45599: done getting variables 13273 1726853335.45687: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:28:55 -0400 (0:00:00.453) 0:00:53.346 ****** 13273 1726853335.45722: entering _queue_task() for managed_node3/shell 13273 1726853335.46128: worker is 1 (out of 1 available) 13273 1726853335.46141: exiting _queue_task() for managed_node3/shell 13273 1726853335.46156: done queuing things up, now waiting for results queue to drain 13273 1726853335.46157: waiting for pending results... 
13273 1726853335.46502: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 13273 1726853335.46723: in run() - task 02083763-bbaf-5fc3-657d-0000000001b5 13273 1726853335.46727: variable 'ansible_search_path' from source: unknown 13273 1726853335.46730: variable 'ansible_search_path' from source: unknown 13273 1726853335.46733: calling self._execute() 13273 1726853335.46807: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853335.46820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.46840: variable 'omit' from source: magic vars 13273 1726853335.47278: variable 'ansible_distribution_major_version' from source: facts 13273 1726853335.47378: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853335.47383: variable 'omit' from source: magic vars 13273 1726853335.47386: variable 'omit' from source: magic vars 13273 1726853335.47729: variable 'dhcp_interface1' from source: play vars 13273 1726853335.47739: variable 'dhcp_interface2' from source: play vars 13273 1726853335.47811: variable 'omit' from source: magic vars 13273 1726853335.47814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853335.47858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853335.47885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853335.47907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853335.47948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853335.48138: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853335.48141: variable 'ansible_host' from source: host 
vars for 'managed_node3' 13273 1726853335.48144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.48282: Set connection var ansible_connection to ssh 13273 1726853335.48298: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853335.48307: Set connection var ansible_shell_executable to /bin/sh 13273 1726853335.48379: Set connection var ansible_shell_type to sh 13273 1726853335.48381: Set connection var ansible_pipelining to False 13273 1726853335.48384: Set connection var ansible_timeout to 10 13273 1726853335.48412: variable 'ansible_shell_executable' from source: unknown 13273 1726853335.48681: variable 'ansible_connection' from source: unknown 13273 1726853335.48684: variable 'ansible_module_compression' from source: unknown 13273 1726853335.48687: variable 'ansible_shell_type' from source: unknown 13273 1726853335.48689: variable 'ansible_shell_executable' from source: unknown 13273 1726853335.48691: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853335.48693: variable 'ansible_pipelining' from source: unknown 13273 1726853335.48695: variable 'ansible_timeout' from source: unknown 13273 1726853335.48697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.48900: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853335.48919: variable 'omit' from source: magic vars 13273 1726853335.48930: starting attempt loop 13273 1726853335.48937: running the handler 13273 1726853335.48954: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853335.48980: _low_level_execute_command(): starting 13273 1726853335.48994: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853335.50022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853335.50068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.50172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.50205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.50293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.52036: stdout chunk (state=3): >>>/root <<< 13273 1726853335.52620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.52625: stdout chunk (state=3): >>><<< 13273 1726853335.52629: stderr chunk 
(state=3): >>><<< 13273 1726853335.53178: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853335.53183: _low_level_execute_command(): starting 13273 1726853335.53188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585 `" && echo ansible-tmp-1726853335.5273433-15736-146622109585585="` echo /root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585 `" ) && sleep 0' 13273 1726853335.53906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853335.53986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.53997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853335.54005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853335.54013: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853335.54020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853335.54084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.54295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.54299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.54400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.56479: stdout chunk (state=3): >>>ansible-tmp-1726853335.5273433-15736-146622109585585=/root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585 <<< 13273 1726853335.56693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.56696: stdout chunk (state=3): >>><<< 13273 1726853335.56699: stderr chunk (state=3): >>><<< 13273 1726853335.56701: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853335.5273433-15736-146622109585585=/root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853335.56703: variable 'ansible_module_compression' from source: unknown 13273 1726853335.56762: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853335.56765: variable 'ansible_facts' from source: unknown 13273 1726853335.57142: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/AnsiballZ_command.py 13273 1726853335.57520: Sending initial data 13273 1726853335.57523: Sent initial data (156 bytes) 13273 1726853335.59222: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853335.59226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853335.59228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853335.59230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13273 1726853335.59232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.59241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853335.59243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.59384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.59410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.59485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.61214: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853335.61267: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpntxdhzyg /root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/AnsiballZ_command.py <<< 13273 1726853335.61278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/AnsiballZ_command.py" <<< 13273 1726853335.61328: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpntxdhzyg" to remote "/root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/AnsiballZ_command.py" <<< 13273 1726853335.63087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.63091: stdout chunk (state=3): >>><<< 13273 1726853335.63096: stderr chunk (state=3): >>><<< 13273 1726853335.63151: done transferring module to remote 13273 1726853335.63155: _low_level_execute_command(): starting 13273 1726853335.63158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/ /root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/AnsiballZ_command.py && sleep 0' 13273 1726853335.64618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853335.64677: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853335.64681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853335.64688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853335.64690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.64838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853335.64841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.64884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.65376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.66986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853335.66990: stdout chunk (state=3): >>><<< 13273 1726853335.66996: stderr chunk (state=3): >>><<< 13273 1726853335.67014: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853335.67017: _low_level_execute_command(): starting 13273 1726853335.67023: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/AnsiballZ_command.py && sleep 0' 13273 1726853335.68477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853335.68481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853335.68886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.68937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.69030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.88657: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:28:55.845764", "end": "2024-09-20 13:28:55.885255", "delta": "0:00:00.039491", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853335.90438: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853335.90489: stdout chunk (state=3): >>><<< 13273 1726853335.90550: stderr chunk (state=3): >>><<< 13273 1726853335.90607: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:28:55.845764", "end": "2024-09-20 13:28:55.885255", "delta": "0:00:00.039491", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853335.90852: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/', 
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853335.90855: _low_level_execute_command(): starting 13273 1726853335.90857: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853335.5273433-15736-146622109585585/ > /dev/null 2>&1 && sleep 0' 13273 1726853335.91413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853335.91428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853335.91441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853335.91527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853335.91566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853335.91589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853335.91623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853335.91757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853335.93699: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 13273 1726853335.93703: stdout chunk (state=3): >>><<< 13273 1726853335.93707: stderr chunk (state=3): >>><<< 13273 1726853335.93776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853335.93781: handler run complete 13273 1726853335.93784: Evaluated conditional (False): False 13273 1726853335.93786: attempt loop complete, returning result 13273 1726853335.93788: _execute() done 13273 1726853335.93792: dumping result to json 13273 1726853335.93794: done dumping result, returning 13273 1726853335.93796: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [02083763-bbaf-5fc3-657d-0000000001b5] 13273 1726853335.93798: sending task result for task 02083763-bbaf-5fc3-657d-0000000001b5 13273 1726853335.93866: done sending task result for task 
02083763-bbaf-5fc3-657d-0000000001b5 13273 1726853335.93869: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.039491", "end": "2024-09-20 13:28:55.885255", "rc": 0, "start": "2024-09-20 13:28:55.845764" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 13273 1726853335.93930: no more pending results, returning what we have 13273 1726853335.93933: results queue empty 13273 1726853335.93934: checking for any_errors_fatal 13273 1726853335.93942: done checking for any_errors_fatal 13273 1726853335.93943: checking for max_fail_percentage 13273 1726853335.93945: done checking for max_fail_percentage 13273 1726853335.93948: checking to see if all hosts have failed and the running result is not ok 13273 1726853335.93949: done checking to see if all hosts have failed 13273 1726853335.93949: getting the remaining hosts for this loop 13273 1726853335.93951: done getting the remaining hosts for this loop 13273 1726853335.93954: getting the next task for host managed_node3 13273 1726853335.93960: done getting next task for host managed_node3 13273 1726853335.93963: ^ task is: TASK: Stop dnsmasq/radvd services 13273 1726853335.93966: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853335.93972: getting variables 13273 1726853335.93974: in VariableManager get_vars() 13273 1726853335.94023: Calling all_inventory to load vars for managed_node3 13273 1726853335.94026: Calling groups_inventory to load vars for managed_node3 13273 1726853335.94028: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853335.94037: Calling all_plugins_play to load vars for managed_node3 13273 1726853335.94040: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853335.94048: Calling groups_plugins_play to load vars for managed_node3 13273 1726853335.95027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853335.96773: done with get_vars() 13273 1726853335.96795: done getting variables 13273 1726853335.96855: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 
September 2024 13:28:55 -0400 (0:00:00.511) 0:00:53.858 ****** 13273 1726853335.96903: entering _queue_task() for managed_node3/shell 13273 1726853335.97554: worker is 1 (out of 1 available) 13273 1726853335.97584: exiting _queue_task() for managed_node3/shell 13273 1726853335.97613: done queuing things up, now waiting for results queue to drain 13273 1726853335.97614: waiting for pending results... 13273 1726853335.97863: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 13273 1726853335.98029: in run() - task 02083763-bbaf-5fc3-657d-0000000001b6 13273 1726853335.98041: variable 'ansible_search_path' from source: unknown 13273 1726853335.98044: variable 'ansible_search_path' from source: unknown 13273 1726853335.98102: calling self._execute() 13273 1726853335.98187: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853335.98208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.98217: variable 'omit' from source: magic vars 13273 1726853335.98650: variable 'ansible_distribution_major_version' from source: facts 13273 1726853335.98696: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853335.98700: variable 'omit' from source: magic vars 13273 1726853335.98725: variable 'omit' from source: magic vars 13273 1726853335.98788: variable 'omit' from source: magic vars 13273 1726853335.98839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853335.98860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853335.98912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853335.98916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853335.98948: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853335.98955: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853335.98959: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853335.98961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.99092: Set connection var ansible_connection to ssh 13273 1726853335.99111: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853335.99115: Set connection var ansible_shell_executable to /bin/sh 13273 1726853335.99117: Set connection var ansible_shell_type to sh 13273 1726853335.99119: Set connection var ansible_pipelining to False 13273 1726853335.99121: Set connection var ansible_timeout to 10 13273 1726853335.99124: variable 'ansible_shell_executable' from source: unknown 13273 1726853335.99126: variable 'ansible_connection' from source: unknown 13273 1726853335.99164: variable 'ansible_module_compression' from source: unknown 13273 1726853335.99167: variable 'ansible_shell_type' from source: unknown 13273 1726853335.99169: variable 'ansible_shell_executable' from source: unknown 13273 1726853335.99173: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853335.99176: variable 'ansible_pipelining' from source: unknown 13273 1726853335.99178: variable 'ansible_timeout' from source: unknown 13273 1726853335.99180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853335.99404: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853335.99408: variable 'omit' from source: magic vars 13273 1726853335.99410: starting attempt 
loop 13273 1726853335.99412: running the handler 13273 1726853335.99415: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853335.99482: _low_level_execute_command(): starting 13273 1726853335.99489: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853336.00687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853336.00692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.00694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853336.00697: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 13273 1726853336.00770: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13273 1726853336.00783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853336.00787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853336.00789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.00792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.00818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.00882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.02601: stdout chunk (state=3): >>>/root <<< 13273 1726853336.02720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.02758: stderr chunk (state=3): >>><<< 13273 1726853336.02762: stdout chunk (state=3): >>><<< 13273 1726853336.02819: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853336.02823: _low_level_execute_command(): starting 13273 1726853336.02827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649 `" && echo ansible-tmp-1726853336.0278842-15767-188781409831649="` echo /root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649 `" ) && sleep 0' 13273 1726853336.03317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853336.03320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.03323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853336.03326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.03368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.03372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.03437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.05356: stdout chunk (state=3): >>>ansible-tmp-1726853336.0278842-15767-188781409831649=/root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649 <<< 13273 1726853336.05463: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.05498: stderr chunk (state=3): >>><<< 13273 1726853336.05501: stdout chunk (state=3): >>><<< 13273 1726853336.05516: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853336.0278842-15767-188781409831649=/root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853336.05542: variable 'ansible_module_compression' from source: unknown 13273 1726853336.05595: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853336.05626: variable 'ansible_facts' from source: unknown 13273 1726853336.05686: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/AnsiballZ_command.py 13273 1726853336.05792: Sending initial data 13273 1726853336.05795: Sent initial data (156 bytes) 13273 1726853336.06235: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853336.06268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853336.06275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.06278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853336.06280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.06332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853336.06336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.06340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.06400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.08000: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13273 1726853336.08009: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853336.08058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853336.08118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpc3_saiqc /root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/AnsiballZ_command.py <<< 13273 1726853336.08121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/AnsiballZ_command.py" <<< 13273 1726853336.08173: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpc3_saiqc" to remote "/root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/AnsiballZ_command.py" <<< 13273 1726853336.08176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/AnsiballZ_command.py" <<< 13273 1726853336.08949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.08994: stderr chunk (state=3): >>><<< 13273 1726853336.08997: stdout chunk (state=3): >>><<< 13273 1726853336.09023: done transferring module to remote 13273 1726853336.09048: _low_level_execute_command(): starting 13273 
1726853336.09058: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/ /root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/AnsiballZ_command.py && sleep 0' 13273 1726853336.09668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853336.09677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.09712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853336.09715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.09718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.09791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.11662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.11692: stderr chunk (state=3): >>><<< 13273 1726853336.11696: stdout chunk (state=3): >>><<< 13273 1726853336.11710: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853336.11713: _low_level_execute_command(): starting 13273 1726853336.11719: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/AnsiballZ_command.py && sleep 0' 13273 1726853336.12242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853336.12245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853336.12248: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.12250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853336.12252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.12304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853336.12312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.12376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.30760: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:28:56.277289", "end": "2024-09-20 13:28:56.304884", "delta": "0:00:00.027595", "msg": "", "invocation": {"module_args": 
{"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853336.32742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853336.32749: stdout chunk (state=3): >>><<< 13273 1726853336.32752: stderr chunk (state=3): >>><<< 13273 1726853336.32754: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:28:56.277289", "end": "2024-09-20 13:28:56.304884", "delta": "0:00:00.027595", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F 
/run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853336.32763: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853336.32765: _low_level_execute_command(): starting 13273 1726853336.32768: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853336.0278842-15767-188781409831649/ > /dev/null 2>&1 && sleep 0' 13273 1726853336.33687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853336.33691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853336.33713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.33716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853336.33718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.33781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853336.33784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.33908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.34002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.36054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.36070: stdout chunk (state=3): >>><<< 13273 1726853336.36086: stderr chunk (state=3): >>><<< 13273 1726853336.36477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853336.36481: handler run complete 13273 1726853336.36484: Evaluated conditional (False): False 13273 1726853336.36486: attempt loop complete, returning result 13273 1726853336.36488: _execute() done 13273 1726853336.36490: dumping result to json 13273 1726853336.36492: done dumping result, returning 13273 1726853336.36494: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [02083763-bbaf-5fc3-657d-0000000001b6] 13273 1726853336.36496: sending task result for task 02083763-bbaf-5fc3-657d-0000000001b6 13273 1726853336.36575: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001b6 13273 1726853336.36580: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.027595", "end": "2024-09-20 13:28:56.304884", "rc": 0, "start": "2024-09-20 13:28:56.277289" } STDERR: + exec + 
pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 13273 1726853336.36648: no more pending results, returning what we have 13273 1726853336.36652: results queue empty 13273 1726853336.36653: checking for any_errors_fatal 13273 1726853336.36668: done checking for any_errors_fatal 13273 1726853336.36669: checking for max_fail_percentage 13273 1726853336.36672: done checking for max_fail_percentage 13273 1726853336.36674: checking to see if all hosts have failed and the running result is not ok 13273 1726853336.36674: done checking to see if all hosts have failed 13273 1726853336.36675: getting the remaining hosts for this loop 13273 1726853336.36677: done getting the remaining hosts for this loop 13273 1726853336.36680: getting the next task for host managed_node3 13273 1726853336.36689: done getting next task for host managed_node3 13273 1726853336.36692: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 13273 1726853336.36695: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853336.36700: getting variables 13273 1726853336.36701: in VariableManager get_vars() 13273 1726853336.36753: Calling all_inventory to load vars for managed_node3 13273 1726853336.36756: Calling groups_inventory to load vars for managed_node3 13273 1726853336.36758: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853336.36769: Calling all_plugins_play to load vars for managed_node3 13273 1726853336.36989: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853336.36994: Calling groups_plugins_play to load vars for managed_node3 13273 1726853336.38741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853336.40596: done with get_vars() 13273 1726853336.40618: done getting variables 13273 1726853336.40798: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Friday 20 September 2024 13:28:56 -0400 (0:00:00.439) 0:00:54.297 ****** 13273 1726853336.40835: entering _queue_task() for managed_node3/command 13273 1726853336.41580: worker is 1 (out of 1 available) 13273 1726853336.41594: exiting _queue_task() for managed_node3/command 13273 1726853336.41607: done queuing things up, now waiting for results queue to drain 13273 1726853336.41608: waiting for pending results... 
13273 1726853336.42085: running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript 13273 1726853336.42284: in run() - task 02083763-bbaf-5fc3-657d-0000000001b7 13273 1726853336.42289: variable 'ansible_search_path' from source: unknown 13273 1726853336.42292: calling self._execute() 13273 1726853336.42311: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853336.42319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853336.42341: variable 'omit' from source: magic vars 13273 1726853336.42666: variable 'ansible_distribution_major_version' from source: facts 13273 1726853336.42677: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853336.42758: variable 'network_provider' from source: set_fact 13273 1726853336.42777: Evaluated conditional (network_provider == "initscripts"): False 13273 1726853336.42781: when evaluation is False, skipping this task 13273 1726853336.42784: _execute() done 13273 1726853336.42786: dumping result to json 13273 1726853336.42788: done dumping result, returning 13273 1726853336.42790: done running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript [02083763-bbaf-5fc3-657d-0000000001b7] 13273 1726853336.42792: sending task result for task 02083763-bbaf-5fc3-657d-0000000001b7 13273 1726853336.42881: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001b7 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13273 1726853336.42926: no more pending results, returning what we have 13273 1726853336.42930: results queue empty 13273 1726853336.42931: checking for any_errors_fatal 13273 1726853336.42939: done checking for any_errors_fatal 13273 1726853336.42940: checking for max_fail_percentage 13273 1726853336.42942: done checking for max_fail_percentage 13273 
1726853336.42943: checking to see if all hosts have failed and the running result is not ok 13273 1726853336.42944: done checking to see if all hosts have failed 13273 1726853336.42945: getting the remaining hosts for this loop 13273 1726853336.42948: done getting the remaining hosts for this loop 13273 1726853336.42952: getting the next task for host managed_node3 13273 1726853336.42959: done getting next task for host managed_node3 13273 1726853336.42962: ^ task is: TASK: Verify network state restored to default 13273 1726853336.42965: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853336.42969: getting variables 13273 1726853336.42972: in VariableManager get_vars() 13273 1726853336.43019: Calling all_inventory to load vars for managed_node3 13273 1726853336.43022: Calling groups_inventory to load vars for managed_node3 13273 1726853336.43025: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853336.43034: Calling all_plugins_play to load vars for managed_node3 13273 1726853336.43036: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853336.43038: Calling groups_plugins_play to load vars for managed_node3 13273 1726853336.43584: WORKER PROCESS EXITING 13273 1726853336.44386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853336.47113: done with get_vars() 13273 1726853336.47137: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Friday 20 September 2024 13:28:56 -0400 (0:00:00.064) 0:00:54.361 ****** 13273 1726853336.47244: entering _queue_task() for managed_node3/include_tasks 13273 1726853336.47607: worker is 1 (out of 1 available) 13273 1726853336.47621: exiting _queue_task() for managed_node3/include_tasks 13273 1726853336.47749: done queuing things up, now waiting for results queue to drain 13273 1726853336.47751: waiting for pending results... 
13273 1726853336.47963: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 13273 1726853336.48106: in run() - task 02083763-bbaf-5fc3-657d-0000000001b8 13273 1726853336.48129: variable 'ansible_search_path' from source: unknown 13273 1726853336.48178: calling self._execute() 13273 1726853336.48309: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853336.48390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853336.48393: variable 'omit' from source: magic vars 13273 1726853336.49081: variable 'ansible_distribution_major_version' from source: facts 13273 1726853336.49086: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853336.49092: _execute() done 13273 1726853336.49095: dumping result to json 13273 1726853336.49097: done dumping result, returning 13273 1726853336.49099: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [02083763-bbaf-5fc3-657d-0000000001b8] 13273 1726853336.49101: sending task result for task 02083763-bbaf-5fc3-657d-0000000001b8 13273 1726853336.49240: no more pending results, returning what we have 13273 1726853336.49249: in VariableManager get_vars() 13273 1726853336.49324: Calling all_inventory to load vars for managed_node3 13273 1726853336.49328: Calling groups_inventory to load vars for managed_node3 13273 1726853336.49331: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853336.49349: Calling all_plugins_play to load vars for managed_node3 13273 1726853336.49354: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853336.49357: Calling groups_plugins_play to load vars for managed_node3 13273 1726853336.50520: done sending task result for task 02083763-bbaf-5fc3-657d-0000000001b8 13273 1726853336.50524: WORKER PROCESS EXITING 13273 1726853336.51723: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853336.53365: done with get_vars() 13273 1726853336.53387: variable 'ansible_search_path' from source: unknown 13273 1726853336.53403: we have included files to process 13273 1726853336.53404: generating all_blocks data 13273 1726853336.53406: done generating all_blocks data 13273 1726853336.53410: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13273 1726853336.53411: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13273 1726853336.53414: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13273 1726853336.53853: done processing included file 13273 1726853336.53855: iterating over new_blocks loaded from include file 13273 1726853336.53857: in VariableManager get_vars() 13273 1726853336.53888: done with get_vars() 13273 1726853336.53890: filtering new block on tags 13273 1726853336.53925: done filtering new block on tags 13273 1726853336.53927: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 13273 1726853336.53933: extending task lists for all hosts with included blocks 13273 1726853336.55263: done extending task lists 13273 1726853336.55265: done processing included files 13273 1726853336.55266: results queue empty 13273 1726853336.55266: checking for any_errors_fatal 13273 1726853336.55269: done checking for any_errors_fatal 13273 1726853336.55272: checking for max_fail_percentage 13273 1726853336.55274: done checking for max_fail_percentage 13273 1726853336.55274: checking to see if all hosts have failed and the running 
result is not ok 13273 1726853336.55275: done checking to see if all hosts have failed 13273 1726853336.55276: getting the remaining hosts for this loop 13273 1726853336.55277: done getting the remaining hosts for this loop 13273 1726853336.55283: getting the next task for host managed_node3 13273 1726853336.55287: done getting next task for host managed_node3 13273 1726853336.55289: ^ task is: TASK: Check routes and DNS 13273 1726853336.55292: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13273 1726853336.55295: getting variables 13273 1726853336.55296: in VariableManager get_vars() 13273 1726853336.55316: Calling all_inventory to load vars for managed_node3 13273 1726853336.55318: Calling groups_inventory to load vars for managed_node3 13273 1726853336.55321: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853336.55326: Calling all_plugins_play to load vars for managed_node3 13273 1726853336.55328: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853336.55331: Calling groups_plugins_play to load vars for managed_node3 13273 1726853336.56658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853336.58250: done with get_vars() 13273 1726853336.58274: done getting variables 13273 1726853336.58322: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:28:56 -0400 (0:00:00.111) 0:00:54.472 ****** 13273 1726853336.58355: entering _queue_task() for managed_node3/shell 13273 1726853336.58791: worker is 1 (out of 1 available) 13273 1726853336.58804: exiting _queue_task() for managed_node3/shell 13273 1726853336.58814: done queuing things up, now waiting for results queue to drain 13273 1726853336.58815: waiting for pending results... 
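Before the shell task below runs, Ansible performs the same remote-side dance it uses for every module: discover the home directory, create a mode-0700 temp directory, upload the AnsiballZ payload, mark it executable, run it, and clean up. A hedged sketch of that sequence, condensed from the `_low_level_execute_command()` lines that follow — using `/tmp` instead of `/root/.ansible` so it runs unprivileged, and a stand-in stub for the real `AnsiballZ_command.py` payload:

```shell
# Sketch of the remote command sequence logged below (paths simplified;
# the payload here is a stub, not the real AnsiballZ zip-embedded module).
home=$(sh -c 'echo ~ && sleep 0')               # step 1: find remote home dir
tmp="/tmp/ansible-tmp-sketch.$$"
( umask 77 && mkdir -p "$tmp" )                 # step 2: private (0700) temp dir
printf '#!/bin/sh\necho MODULE_RAN\n' > "$tmp/AnsiballZ_command.sh"  # step 3: "transfer" module
chmod u+x "$tmp" "$tmp/AnsiballZ_command.sh"    # step 4: chmod dir and module
out=$(sh -c "$tmp/AnsiballZ_command.sh && sleep 0")  # step 5: execute module
echo "$out"
rm -rf "$tmp"                                   # step 6: cleanup
```

The trailing `&& sleep 0` on each real command in the log is Ansible's way of forcing a predictable exit-status path through the SSH multiplexed channel; the sketch keeps it only where the log shows it mattering.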
13273 1726853336.59082: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 13273 1726853336.59216: in run() - task 02083763-bbaf-5fc3-657d-0000000009f0 13273 1726853336.59236: variable 'ansible_search_path' from source: unknown 13273 1726853336.59243: variable 'ansible_search_path' from source: unknown 13273 1726853336.59286: calling self._execute() 13273 1726853336.59400: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853336.59419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853336.59438: variable 'omit' from source: magic vars 13273 1726853336.59859: variable 'ansible_distribution_major_version' from source: facts 13273 1726853336.59879: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853336.59892: variable 'omit' from source: magic vars 13273 1726853336.59945: variable 'omit' from source: magic vars 13273 1726853336.59998: variable 'omit' from source: magic vars 13273 1726853336.60040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853336.60092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853336.60116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853336.60180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853336.60186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853336.60200: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853336.60208: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853336.60216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853336.60324: 
Set connection var ansible_connection to ssh 13273 1726853336.60339: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853336.60352: Set connection var ansible_shell_executable to /bin/sh 13273 1726853336.60395: Set connection var ansible_shell_type to sh 13273 1726853336.60403: Set connection var ansible_pipelining to False 13273 1726853336.60405: Set connection var ansible_timeout to 10 13273 1726853336.60416: variable 'ansible_shell_executable' from source: unknown 13273 1726853336.60423: variable 'ansible_connection' from source: unknown 13273 1726853336.60430: variable 'ansible_module_compression' from source: unknown 13273 1726853336.60436: variable 'ansible_shell_type' from source: unknown 13273 1726853336.60443: variable 'ansible_shell_executable' from source: unknown 13273 1726853336.60504: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853336.60508: variable 'ansible_pipelining' from source: unknown 13273 1726853336.60510: variable 'ansible_timeout' from source: unknown 13273 1726853336.60513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853336.60722: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853336.60725: variable 'omit' from source: magic vars 13273 1726853336.60728: starting attempt loop 13273 1726853336.60731: running the handler 13273 1726853336.60734: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853336.60736: 
_low_level_execute_command(): starting 13273 1726853336.60738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853336.61550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.61600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853336.61621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.61644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.61722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.63464: stdout chunk (state=3): >>>/root <<< 13273 1726853336.63627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.63631: stdout chunk (state=3): >>><<< 13273 1726853336.63633: stderr chunk (state=3): >>><<< 13273 1726853336.63655: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853336.63754: _low_level_execute_command(): starting 13273 1726853336.63758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767 `" && echo ansible-tmp-1726853336.6366298-15803-128439459163767="` echo /root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767 `" ) && sleep 0' 13273 1726853336.64363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.64377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.64466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.66398: stdout chunk (state=3): >>>ansible-tmp-1726853336.6366298-15803-128439459163767=/root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767 <<< 13273 1726853336.66516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.66534: stderr chunk (state=3): >>><<< 13273 1726853336.66537: stdout chunk (state=3): >>><<< 13273 1726853336.66554: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853336.6366298-15803-128439459163767=/root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853336.66582: variable 'ansible_module_compression' from source: unknown 13273 1726853336.66629: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853336.66660: variable 'ansible_facts' from source: unknown 13273 1726853336.66721: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/AnsiballZ_command.py 13273 1726853336.66814: Sending initial data 13273 1726853336.66817: Sent initial data (156 bytes) 13273 1726853336.67353: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853336.67367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.67396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.67486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.69072: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853336.69141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13273 1726853336.69217: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpzcdly5e6 /root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/AnsiballZ_command.py <<< 13273 1726853336.69220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/AnsiballZ_command.py" <<< 13273 1726853336.69295: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpzcdly5e6" to remote "/root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/AnsiballZ_command.py" <<< 13273 1726853336.70393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.70482: stderr chunk (state=3): >>><<< 13273 1726853336.70486: stdout chunk (state=3): >>><<< 13273 1726853336.70488: done transferring module to remote 13273 1726853336.70490: _low_level_execute_command(): starting 13273 1726853336.70493: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/ /root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/AnsiballZ_command.py && sleep 0' 13273 1726853336.71092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853336.71150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.71219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853336.71262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.71282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.71370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.73444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.73451: stdout chunk (state=3): >>><<< 13273 1726853336.73454: stderr chunk (state=3): >>><<< 13273 1726853336.73456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853336.73463: _low_level_execute_command(): starting 13273 1726853336.73466: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/AnsiballZ_command.py && sleep 0' 13273 1726853336.74117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853336.74121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853336.74124: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853336.74126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.74136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853336.74160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 13273 1726853336.74260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.90661: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3143sec preferred_lft 3143sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:28:56.896208", "end": "2024-09-20 13:28:56.905112", "delta": "0:00:00.008904", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853336.92225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 13273 1726853336.92253: stderr chunk (state=3): >>><<< 13273 1726853336.92257: stdout chunk (state=3): >>><<< 13273 1726853336.92283: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3143sec preferred_lft 3143sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:28:56.896208", "end": "2024-09-20 13:28:56.905112", "delta": "0:00:00.008904", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f 
/etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
13273 1726853336.92324: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853336.92332: _low_level_execute_command(): starting 13273 1726853336.92337: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853336.6366298-15803-128439459163767/ > /dev/null 2>&1 && sleep 0' 13273 1726853336.92843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853336.92849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853336.92851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 
13273 1726853336.92853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853336.92855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.92904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853336.92909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853336.92990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853336.94816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853336.94835: stderr chunk (state=3): >>><<< 13273 1726853336.94838: stdout chunk (state=3): >>><<< 13273 1726853336.94865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853336.94868: handler run complete 13273 1726853336.94888: Evaluated conditional (False): False 13273 1726853336.94899: attempt loop complete, returning result 13273 1726853336.94902: _execute() done 13273 1726853336.94904: dumping result to json 13273 1726853336.94906: done dumping result, returning 13273 1726853336.94917: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [02083763-bbaf-5fc3-657d-0000000009f0] 13273 1726853336.94919: sending task result for task 02083763-bbaf-5fc3-657d-0000000009f0 13273 1726853336.95024: done sending task result for task 02083763-bbaf-5fc3-657d-0000000009f0 13273 1726853336.95027: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008904", "end": "2024-09-20 13:28:56.905112", "rc": 0, "start": "2024-09-20 13:28:56.896208" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3143sec preferred_lft 3143sec inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute valid_lft 
forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 13273 1726853336.95101: no more pending results, returning what we have 13273 1726853336.95105: results queue empty 13273 1726853336.95106: checking for any_errors_fatal 13273 1726853336.95107: done checking for any_errors_fatal 13273 1726853336.95108: checking for max_fail_percentage 13273 1726853336.95110: done checking for max_fail_percentage 13273 1726853336.95111: checking to see if all hosts have failed and the running result is not ok 13273 1726853336.95111: done checking to see if all hosts have failed 13273 1726853336.95112: getting the remaining hosts for this loop 13273 1726853336.95114: done getting the remaining hosts for this loop 13273 1726853336.95117: getting the next task for host managed_node3 13273 1726853336.95124: done getting next task for host managed_node3 13273 1726853336.95126: ^ task is: TASK: Verify DNS and network connectivity 13273 1726853336.95130: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13273 1726853336.95135: getting variables 13273 1726853336.95138: in VariableManager get_vars() 13273 1726853336.95191: Calling all_inventory to load vars for managed_node3 13273 1726853336.95193: Calling groups_inventory to load vars for managed_node3 13273 1726853336.95195: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853336.95205: Calling all_plugins_play to load vars for managed_node3 13273 1726853336.95207: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853336.95209: Calling groups_plugins_play to load vars for managed_node3 13273 1726853336.96220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853336.97162: done with get_vars() 13273 1726853336.97181: done getting variables 13273 1726853336.97222: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:28:56 -0400 (0:00:00.388) 0:00:54.861 ****** 13273 1726853336.97249: entering _queue_task() for managed_node3/shell 13273 1726853336.97488: worker is 1 (out of 1 available) 13273 1726853336.97501: exiting _queue_task() for managed_node3/shell 13273 1726853336.97514: done queuing things up, now waiting for results queue to drain 13273 1726853336.97515: waiting for pending results... 
13273 1726853336.97717: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 13273 1726853336.97800: in run() - task 02083763-bbaf-5fc3-657d-0000000009f1 13273 1726853336.97811: variable 'ansible_search_path' from source: unknown 13273 1726853336.97815: variable 'ansible_search_path' from source: unknown 13273 1726853336.97844: calling self._execute() 13273 1726853336.97925: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853336.97930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853336.97938: variable 'omit' from source: magic vars 13273 1726853336.98230: variable 'ansible_distribution_major_version' from source: facts 13273 1726853336.98240: Evaluated conditional (ansible_distribution_major_version != '6'): True 13273 1726853336.98338: variable 'ansible_facts' from source: unknown 13273 1726853336.98878: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 13273 1726853336.98882: variable 'omit' from source: magic vars 13273 1726853336.98927: variable 'omit' from source: magic vars 13273 1726853336.98949: variable 'omit' from source: magic vars 13273 1726853336.98982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13273 1726853336.99009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13273 1726853336.99026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13273 1726853336.99049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853336.99076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13273 1726853336.99105: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13273 1726853336.99108: variable 
'ansible_host' from source: host vars for 'managed_node3' 13273 1726853336.99110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853336.99176: Set connection var ansible_connection to ssh 13273 1726853336.99184: Set connection var ansible_module_compression to ZIP_DEFLATED 13273 1726853336.99187: Set connection var ansible_shell_executable to /bin/sh 13273 1726853336.99191: Set connection var ansible_shell_type to sh 13273 1726853336.99197: Set connection var ansible_pipelining to False 13273 1726853336.99206: Set connection var ansible_timeout to 10 13273 1726853336.99223: variable 'ansible_shell_executable' from source: unknown 13273 1726853336.99226: variable 'ansible_connection' from source: unknown 13273 1726853336.99228: variable 'ansible_module_compression' from source: unknown 13273 1726853336.99231: variable 'ansible_shell_type' from source: unknown 13273 1726853336.99233: variable 'ansible_shell_executable' from source: unknown 13273 1726853336.99235: variable 'ansible_host' from source: host vars for 'managed_node3' 13273 1726853336.99239: variable 'ansible_pipelining' from source: unknown 13273 1726853336.99242: variable 'ansible_timeout' from source: unknown 13273 1726853336.99250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13273 1726853336.99360: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853336.99369: variable 'omit' from source: magic vars 13273 1726853336.99375: starting attempt loop 13273 1726853336.99377: running the handler 13273 1726853336.99387: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13273 1726853336.99404: _low_level_execute_command(): starting 13273 1726853336.99413: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13273 1726853336.99958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853336.99961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853336.99964: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853336.99966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 13273 1726853336.99969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.00032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853337.00035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853337.00104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853337.01803: stdout chunk (state=3): >>>/root <<< 13273 1726853337.01938: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853337.01951: stderr chunk (state=3): >>><<< 13273 1726853337.01954: stdout chunk (state=3): >>><<< 13273 1726853337.01974: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853337.01986: _low_level_execute_command(): starting 13273 1726853337.02003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255 `" && echo ansible-tmp-1726853337.019747-15825-137520694570255="` echo /root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255 `" ) && sleep 0' 13273 1726853337.02526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 13273 1726853337.02529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853337.02531: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.02533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853337.02536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.02597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853337.02601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853337.02657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853337.04618: stdout chunk (state=3): >>>ansible-tmp-1726853337.019747-15825-137520694570255=/root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255 <<< 13273 1726853337.04724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853337.04777: stderr chunk (state=3): >>><<< 13273 1726853337.04781: stdout chunk (state=3): >>><<< 13273 1726853337.04783: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853337.019747-15825-137520694570255=/root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853337.04795: variable 'ansible_module_compression' from source: unknown 13273 1726853337.04838: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13273pukop8ph/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13273 1726853337.04874: variable 'ansible_facts' from source: unknown 13273 1726853337.04931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/AnsiballZ_command.py 13273 1726853337.05031: Sending initial data 13273 1726853337.05035: Sent initial data (155 bytes) 13273 1726853337.05458: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 13273 1726853337.05495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853337.05500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.05502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13273 1726853337.05506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 13273 1726853337.05509: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.05551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853337.05554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853337.05623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853337.07254: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13273 1726853337.07312: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13273 1726853337.07390: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13273pukop8ph/tmpan0kjkiy /root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/AnsiballZ_command.py <<< 13273 1726853337.07394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/AnsiballZ_command.py" <<< 13273 1726853337.07436: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13273pukop8ph/tmpan0kjkiy" to remote "/root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/AnsiballZ_command.py" <<< 13273 1726853337.07448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/AnsiballZ_command.py" <<< 13273 1726853337.08066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853337.08105: stderr chunk (state=3): >>><<< 13273 1726853337.08108: stdout chunk (state=3): >>><<< 13273 1726853337.08151: done transferring module to remote 13273 1726853337.08157: _low_level_execute_command(): starting 13273 1726853337.08162: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/ /root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/AnsiballZ_command.py && sleep 0' 13273 1726853337.08735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13273 1726853337.08738: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853337.08749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853337.08760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13273 1726853337.08775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.08804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 13273 1726853337.08809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.08881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853337.08888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853337.08962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853337.10814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853337.10835: stderr chunk (state=3): >>><<< 13273 1726853337.10838: stdout chunk (state=3): >>><<< 13273 1726853337.10853: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853337.10856: _low_level_execute_command(): starting 13273 1726853337.10860: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/AnsiballZ_command.py && sleep 0' 13273 1726853337.11330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853337.11334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853337.11337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.11376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.11411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853337.11414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853337.11489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853337.43510: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4682 0 --:--:-- --:--:-- --:--:-- 4692\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3965 0 --:--:-- --:--:-- --:--:-- 3986", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:28:57.271100", "end": "2024-09-20 13:28:57.433617", "delta": "0:00:00.162517", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13273 1726853337.45180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 13273 1726853337.45212: stderr chunk (state=3): >>><<< 13273 1726853337.45215: stdout chunk (state=3): >>><<< 13273 1726853337.45233: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4682 0 --:--:-- --:--:-- --:--:-- 4692\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3965 0 --:--:-- --:--:-- --:--:-- 3986", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:28:57.271100", "end": "2024-09-20 13:28:57.433617", "delta": "0:00:00.162517", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 13273 1726853337.45274: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13273 1726853337.45281: _low_level_execute_command(): starting 13273 1726853337.45286: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853337.019747-15825-137520694570255/ > /dev/null 2>&1 && sleep 0' 13273 1726853337.45739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13273 1726853337.45744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 13273 1726853337.45749: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.45751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13273 1726853337.45753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13273 1726853337.45809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 13273 1726853337.45814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13273 1726853337.45816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13273 1726853337.45876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13273 1726853337.47753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13273 1726853337.47779: stderr chunk (state=3): >>><<< 13273 1726853337.47782: stdout chunk (state=3): >>><<< 13273 1726853337.47797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13273 1726853337.47802: handler run complete 13273 1726853337.47820: Evaluated conditional (False): False 13273 1726853337.47829: attempt loop complete, returning result 13273 1726853337.47831: _execute() done 13273 1726853337.47835: dumping result to json 13273 1726853337.47841: done dumping result, returning 13273 1726853337.47850: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [02083763-bbaf-5fc3-657d-0000000009f1] 13273 1726853337.47853: sending task result for task 02083763-bbaf-5fc3-657d-0000000009f1 13273 1726853337.47950: done sending task result for task 02083763-bbaf-5fc3-657d-0000000009f1 13273 1726853337.47953: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.162517",
    "end": "2024-09-20 13:28:57.433617",
    "rc": 0,
    "start": "2024-09-20 13:28:57.271100"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   4682      0 --:--:-- --:--:-- --:--:--  4692
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   3965      0 --:--:-- --:--:-- --:--:--  3986

13273 1726853337.48023: no more pending results, returning what we have 13273 1726853337.48026: results queue empty 13273 1726853337.48027:
checking for any_errors_fatal 13273 1726853337.48037: done checking for any_errors_fatal 13273 1726853337.48038: checking for max_fail_percentage 13273 1726853337.48040: done checking for max_fail_percentage 13273 1726853337.48040: checking to see if all hosts have failed and the running result is not ok 13273 1726853337.48041: done checking to see if all hosts have failed 13273 1726853337.48042: getting the remaining hosts for this loop 13273 1726853337.48049: done getting the remaining hosts for this loop 13273 1726853337.48053: getting the next task for host managed_node3 13273 1726853337.48063: done getting next task for host managed_node3 13273 1726853337.48065: ^ task is: TASK: meta (flush_handlers) 13273 1726853337.48067: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853337.48089: getting variables 13273 1726853337.48091: in VariableManager get_vars() 13273 1726853337.48137: Calling all_inventory to load vars for managed_node3 13273 1726853337.48139: Calling groups_inventory to load vars for managed_node3 13273 1726853337.48141: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853337.48153: Calling all_plugins_play to load vars for managed_node3 13273 1726853337.48155: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853337.48158: Calling groups_plugins_play to load vars for managed_node3 13273 1726853337.48983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853337.49859: done with get_vars() 13273 1726853337.49877: done getting variables 13273 1726853337.49931: in VariableManager get_vars() 13273 1726853337.49947: Calling all_inventory to load vars for managed_node3 13273 1726853337.49949: Calling groups_inventory to load vars for managed_node3 13273 1726853337.49950: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853337.49954: Calling all_plugins_play to load vars for managed_node3 13273 1726853337.49955: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853337.49957: Calling groups_plugins_play to load vars for managed_node3 13273 1726853337.51001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853337.52956: done with get_vars() 13273 1726853337.53016: done queuing things up, now waiting for results queue to drain 13273 1726853337.53018: results queue empty 13273 1726853337.53019: checking for any_errors_fatal 13273 1726853337.53024: done checking for any_errors_fatal 13273 1726853337.53025: checking for max_fail_percentage 13273 1726853337.53026: done checking for max_fail_percentage 13273 1726853337.53027: checking to see if all hosts have failed and the running result is not 
ok 13273 1726853337.53028: done checking to see if all hosts have failed 13273 1726853337.53028: getting the remaining hosts for this loop 13273 1726853337.53029: done getting the remaining hosts for this loop 13273 1726853337.53032: getting the next task for host managed_node3 13273 1726853337.53037: done getting next task for host managed_node3 13273 1726853337.53039: ^ task is: TASK: meta (flush_handlers) 13273 1726853337.53040: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13273 1726853337.53043: getting variables 13273 1726853337.53044: in VariableManager get_vars() 13273 1726853337.53111: Calling all_inventory to load vars for managed_node3 13273 1726853337.53114: Calling groups_inventory to load vars for managed_node3 13273 1726853337.53116: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853337.53122: Calling all_plugins_play to load vars for managed_node3 13273 1726853337.53124: Calling groups_plugins_inventory to load vars for managed_node3 13273 1726853337.53127: Calling groups_plugins_play to load vars for managed_node3 13273 1726853337.54441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853337.56585: done with get_vars() 13273 1726853337.56610: done getting variables 13273 1726853337.56669: in VariableManager get_vars() 13273 1726853337.56694: Calling all_inventory to load vars for managed_node3 13273 1726853337.56697: Calling groups_inventory to load vars for managed_node3 13273 1726853337.56699: Calling all_plugins_inventory to load vars for managed_node3 13273 1726853337.56705: Calling all_plugins_play to load vars for managed_node3 13273 1726853337.56707: Calling groups_plugins_inventory to load vars for 
managed_node3 13273 1726853337.56710: Calling groups_plugins_play to load vars for managed_node3 13273 1726853337.58758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13273 1726853337.60323: done with get_vars() 13273 1726853337.60346: done queuing things up, now waiting for results queue to drain 13273 1726853337.60348: results queue empty 13273 1726853337.60349: checking for any_errors_fatal 13273 1726853337.60350: done checking for any_errors_fatal 13273 1726853337.60351: checking for max_fail_percentage 13273 1726853337.60352: done checking for max_fail_percentage 13273 1726853337.60353: checking to see if all hosts have failed and the running result is not ok 13273 1726853337.60353: done checking to see if all hosts have failed 13273 1726853337.60354: getting the remaining hosts for this loop 13273 1726853337.60355: done getting the remaining hosts for this loop 13273 1726853337.60357: getting the next task for host managed_node3 13273 1726853337.60360: done getting next task for host managed_node3 13273 1726853337.60361: ^ task is: None 13273 1726853337.60362: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13273 1726853337.60363: done queuing things up, now waiting for results queue to drain 13273 1726853337.60364: results queue empty 13273 1726853337.60364: checking for any_errors_fatal 13273 1726853337.60365: done checking for any_errors_fatal 13273 1726853337.60366: checking for max_fail_percentage 13273 1726853337.60366: done checking for max_fail_percentage 13273 1726853337.60367: checking to see if all hosts have failed and the running result is not ok 13273 1726853337.60367: done checking to see if all hosts have failed 13273 1726853337.60369: getting the next task for host managed_node3 13273 1726853337.60373: done getting next task for host managed_node3 13273 1726853337.60374: ^ task is: None 13273 1726853337.60375: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node3              : ok=109  changed=5    unreachable=0    failed=0    skipped=120  rescued=0    ignored=0

Friday 20 September 2024  13:28:57 -0400 (0:00:00.632)       0:00:55.494 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.15s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.10s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.94s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.89s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.89s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.89s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.74s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.15s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.15s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.12s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3
Install dnsmasq --------------------------------------------------------- 1.04s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.02s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.96s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.94s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.94s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.94s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install pgrep, sysctl --------------------------------------------------- 0.85s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Check if system is ostree ----------------------------------------------- 0.83s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Check which packages are installed --- 0.82s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
13273 1726853337.60580: RUNNING CLEANUP
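The repeated `debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b'` lines throughout this run show Ansible's ssh connection plugin reusing an OpenSSH ControlMaster socket instead of opening a fresh connection per command. Ansible passes these settings as `-o` options on the ssh command line rather than through a config file; an equivalent standalone ssh_config fragment, with illustrative values, would be:

```
# Sketch only: Ansible supplies these as -o options, not via ssh_config.
Host *
    ControlMaster auto          # reuse an existing master, or start one
    ControlPath ~/.ansible/cp/%C   # per-destination socket, as seen in the log
    ControlPersist 60s          # keep the master alive between tasks
```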
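For reference, the script that the "Verify DNS and network connectivity" task executed can be recovered from the `_raw_params` in the log above. The sketch below wraps it in a function so it can be sourced and reused; the function name `check_dns_connectivity` is illustrative (not part of the playbook), and `exit 1` has been replaced with `return 1` to suit the function form:

```shell
#!/usr/bin/env bash
# Recovered from the task's module_args in the log above.
# The function name is illustrative; the body mirrors the logged script.
check_dns_connectivity() {
    set -euo pipefail   # note: once called, these options apply shell-wide
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # getent resolves the name through NSS (covers /etc/hosts and DNS)
        if ! getent hosts "$host"; then
            echo "FAILED to lookup host $host"
            return 1
        fi
        # curl confirms an HTTPS connection can actually be established
        if ! curl -o /dev/null "https://$host"; then
            echo "FAILED to contact host $host"
            return 1
        fi
    done
}
```

Calling `check_dns_connectivity` performs real DNS lookups and HTTPS requests, so it only succeeds on a host with working name resolution and outbound connectivity, which is exactly what the task asserts.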